Mar 18 16:42:07.142544 ip-10-0-130-255 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Mar 18 16:42:07.142557 ip-10-0-130-255 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Mar 18 16:42:07.142564 ip-10-0-130-255 systemd[1]: kubelet.service: Failed with result 'resources'.
Mar 18 16:42:07.142777 ip-10-0-130-255 systemd[1]: Failed to start Kubernetes Kubelet.
Mar 18 16:42:18.467278 ip-10-0-130-255 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Mar 18 16:42:18.467298 ip-10-0-130-255 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 40f5f338543f4d3ab472f76e542e457d --
Mar 18 16:44:37.303373 ip-10-0-130-255 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:44:37.737447 ip-10-0-130-255 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:37.737447 ip-10-0-130-255 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:44:37.737447 ip-10-0-130-255 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:37.737447 ip-10-0-130-255 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:44:37.737447 ip-10-0-130-255 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:44:37.740145 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.740038 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:44:37.744645 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744621 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:37.744645 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744642 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:37.744645 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744646 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:37.744645 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744649 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744653 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744657 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744660 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744662 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744665 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744668 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744670 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744673 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744676 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744678 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744681 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744683 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744688 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744692 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744695 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744698 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744700 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744703 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744706 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:37.744825 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744709 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744714 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744717 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744720 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744722 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744724 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744727 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744729 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744732 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744734 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744737 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744739 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744742 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744744 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744747 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744750 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744752 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744755 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744757 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744760 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:37.745322 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744762 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744765 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744767 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744770 2578 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744772 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744776 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744779 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744781 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744784 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744786 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744789 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744791 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744796 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744800 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744804 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744807 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744811 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744814 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744817 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744820 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:37.745803 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744822 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744825 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744827 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744830 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744833 2578 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744836 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744839 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744842 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744845 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744847 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744850 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744852 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744855 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744860 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744863 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744866 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744869 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744872 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744875 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744877 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:37.746297 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744880 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744882 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.744884 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745791 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745798 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745802 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745806 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745811 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745814 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745817 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745820 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745823 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745826 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745829 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745832 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745834 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745837 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745839 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745843 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:44:37.746775 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745845 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745848 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745851 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745854 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745857 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745859 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745862 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745864 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745867 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745870 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745873 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745875 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745877 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745880 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745882 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745885 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745888 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745891 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745893 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745896 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:44:37.747242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745898 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745901 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745903 2578 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745906 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745908 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745911 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745914 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745916 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745918 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745921 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745923 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745926 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745928 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745931 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745933 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745935 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745938 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745940 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745943 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745945 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:44:37.747728 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745947 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745950 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745953 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745955 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745957 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745960 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745962 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745965 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745967 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745970 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745972 2578 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745975 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745977 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745980 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745982 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745984 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745987 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745989 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745993 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745995 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.745998 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:44:37.748242 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746000 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746002 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746006 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746010 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746013 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746015 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746018 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746020 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746023 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746127 2578 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746138 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746148 2578 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746154 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746159 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746162 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746166 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746171 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746175 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746178 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746181 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746185 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:44:37.748751 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746188 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746191 2578 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746193 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746196 2578 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746199 2578 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746201 2578 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746204 2578 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746208 2578 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746211 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746214 2578 flags.go:64] FLAG: --config-dir=""
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746217 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746220 2578 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746224 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746227 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746229 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746232 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746235 2578 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746238 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746240 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746243 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746246 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746256 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746259 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746262 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746265 2578 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:44:37.749271 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746269 2578 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746271 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746277 2578 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746280 2578 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746282 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746285 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746288 2578 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746292 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746295 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746298 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746301 2578 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746304 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:44:37.749894
ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746307 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746310 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746313 2578 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746316 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746319 2578 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746322 2578 flags.go:64] FLAG: --feature-gates="" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746326 2578 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746328 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746331 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746335 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746337 2578 flags.go:64] FLAG: --healthz-port="10248" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746340 2578 flags.go:64] FLAG: --help="false" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746343 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-130-255.ec2.internal" Mar 18 16:44:37.749894 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746346 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746349 2578 flags.go:64] 
FLAG: --http-check-frequency="20s" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746352 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746357 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746360 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746363 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746366 2578 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746368 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746371 2578 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746374 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746377 2578 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746380 2578 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746383 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746386 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746389 2578 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:44:37.750511 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:44:37.746392 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746395 2578 flags.go:64] FLAG: --lock-file="" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746397 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746400 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746403 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746408 2578 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746411 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746414 2578 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746417 2578 flags.go:64] FLAG: --logging-format="text" Mar 18 16:44:37.750511 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746419 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746423 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746425 2578 flags.go:64] FLAG: --manifest-url="" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746428 2578 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746432 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746436 2578 flags.go:64] FLAG: 
--max-open-files="1000000" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746440 2578 flags.go:64] FLAG: --max-pods="110" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746443 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746446 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746449 2578 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746452 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746456 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746459 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746462 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746469 2578 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746472 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746475 2578 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746478 2578 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746481 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:44:37.751134 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:44:37.746487 2578 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746490 2578 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746494 2578 flags.go:64] FLAG: --pods-per-core="0" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746496 2578 flags.go:64] FLAG: --port="10250" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746499 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:44:37.751134 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746502 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ad53dbc7d616f5ee" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746505 2578 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746508 2578 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746511 2578 flags.go:64] FLAG: --register-node="true" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746513 2578 flags.go:64] FLAG: --register-schedulable="true" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746516 2578 flags.go:64] FLAG: --register-with-taints="" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746519 2578 flags.go:64] FLAG: --registry-burst="10" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746522 2578 flags.go:64] FLAG: --registry-qps="5" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746525 2578 flags.go:64] FLAG: --reserved-cpus="" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746527 2578 flags.go:64] FLAG: --reserved-memory="" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 
16:44:37.746531 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746534 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746537 2578 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746539 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746545 2578 flags.go:64] FLAG: --runonce="false" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746547 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746550 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746553 2578 flags.go:64] FLAG: --seccomp-default="false" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746556 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746560 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746563 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746566 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746569 2578 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746572 2578 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746574 2578 flags.go:64] FLAG: 
--storage-driver-table="stats" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746577 2578 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 16:44:37.751712 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746581 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746584 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746587 2578 flags.go:64] FLAG: --system-cgroups="" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746590 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746595 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746598 2578 flags.go:64] FLAG: --tls-cert-file="" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746600 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746604 2578 flags.go:64] FLAG: --tls-min-version="" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746607 2578 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746610 2578 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746613 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746615 2578 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746619 2578 flags.go:64] FLAG: --v="2" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 
16:44:37.746623 2578 flags.go:64] FLAG: --version="false" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746627 2578 flags.go:64] FLAG: --vmodule="" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746631 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.746634 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746731 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746736 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746739 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746742 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746745 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746748 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:37.752393 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746750 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746754 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746759 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746761 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746764 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746766 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746769 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746771 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746774 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746776 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746780 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746783 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746786 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746788 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746791 2578 feature_gate.go:328] unrecognized 
feature gate: AutomatedEtcdBackup Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746793 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746796 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746798 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746802 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 18 16:44:37.753236 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746806 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746808 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746811 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746814 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746817 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746819 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746822 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746825 2578 feature_gate.go:328] unrecognized feature gate: 
VolumeGroupSnapshot Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746828 2578 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746830 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746834 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746836 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746839 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746841 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746844 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746847 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746849 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746852 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746854 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746857 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:37.754062 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746859 
2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746862 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746865 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746867 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746870 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746873 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746875 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746877 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746880 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746883 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746885 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746887 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746890 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 
16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746892 2578 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746895 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746897 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746900 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746902 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746905 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746907 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:37.754936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746910 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746912 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746916 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746918 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746921 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746923 2578 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746926 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746928 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746931 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746933 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746936 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746938 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746941 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746944 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746946 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746949 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746951 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746954 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 
16:44:37.746957 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746959 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:37.755823 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.746961 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.747504 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.754561 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.8" Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.754581 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754653 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754662 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754667 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754673 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 
16:44:37.754678 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754683 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754687 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754692 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754696 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754702 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754707 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754711 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:37.756465 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754715 2578 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754719 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754723 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754727 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754731 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 
16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754735 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754742 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754748 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754753 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754757 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754761 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754764 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754769 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754774 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754780 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754786 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754791 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754796 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754800 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:37.756936 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754805 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754810 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754814 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754819 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754825 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754830 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754834 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754839 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 
16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754843 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754847 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754851 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754855 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754859 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754863 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754867 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754871 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754875 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754880 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754884 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754888 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:37.757704 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754892 2578 feature_gate.go:328] unrecognized feature 
gate: GatewayAPIController Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754895 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754900 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754904 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754908 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754912 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754916 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754921 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754925 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754930 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754934 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754938 2578 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754943 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754947 2578 feature_gate.go:328] 
unrecognized feature gate: SignatureStores Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754952 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754956 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754960 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754964 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754969 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754973 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:37.758209 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754977 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754982 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754986 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754990 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754994 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.754998 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:37.758898 ip-10-0-130-255 
kubenswrapper[2578]: W0318 16:44:37.755002 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755005 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755009 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755013 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755017 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755021 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755025 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755029 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755033 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:37.758898 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.755041 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:37.758898 
ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755227 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755236 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755241 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755245 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755251 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755255 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755259 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755263 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755268 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755272 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755276 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755280 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755284 2578 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755288 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755292 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755296 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755299 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755304 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755308 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755312 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 18 16:44:37.759570 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755317 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755321 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755326 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755330 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755335 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755339 
2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755343 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755347 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755351 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755356 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755360 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755364 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755368 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755372 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755376 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755381 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755385 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755390 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 18 
16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755394 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755398 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 18 16:44:37.760206 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755402 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755407 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755411 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755415 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755419 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755423 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755427 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755431 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755435 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755439 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 
16:44:37.755443 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755447 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755451 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755455 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755459 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755463 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755467 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755471 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755475 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755480 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 18 16:44:37.760707 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755484 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755488 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755492 2578 feature_gate.go:328] unrecognized feature gate: Example Mar 18 16:44:37.761258 
ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755496 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755502 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755509 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755514 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755519 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755524 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755528 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755533 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755538 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755543 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755547 2578 feature_gate.go:328] unrecognized feature gate: Example2 Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755551 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 18 16:44:37.761258 ip-10-0-130-255 
kubenswrapper[2578]: W0318 16:44:37.755555 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755559 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755564 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755568 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 18 16:44:37.761258 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755572 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755576 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755580 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755584 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755588 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755592 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:37.755596 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.755604 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false 
MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.756441 2578 server.go:962] "Client rotation is on, will bootstrap in background" Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.759986 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.760862 2578 server.go:1019] "Starting client certificate rotation" Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.760960 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:37.761749 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.760994 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Mar 18 16:44:37.784809 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.784782 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:37.788690 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.788669 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 16:44:37.804129 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.804109 2578 log.go:25] "Validated CRI v1 runtime API" Mar 18 16:44:37.810936 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.810915 2578 log.go:25] "Validated CRI v1 image API" Mar 18 16:44:37.812153 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.812129 2578 server.go:1452] "Using 
cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 16:44:37.812230 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.812170 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:44:37.814756 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.814733 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c3d50786-f8d2-463a-8b1d-6c6f752a4bfa:/dev/nvme0n1p3 dd71dbdd-bdbd-4777-9a94-31fabb524880:/dev/nvme0n1p4] Mar 18 16:44:37.814826 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.814754 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Mar 18 16:44:37.820318 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.820200 2578 manager.go:217] Machine: {Timestamp:2026-03-18 16:44:37.818494222 +0000 UTC m=+0.403268239 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099750 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec236357f60fd2d385bf66bef36b8b2d SystemUUID:ec236357-f60f-d2d3-85bf-66bef36b8b2d BootID:40f5f338-543f-4d3a-b472-f76e542e457d Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} 
{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1f:d5:31:69:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1f:d5:31:69:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:e2:10:7b:a0:1f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 16:44:37.820318 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.820307 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 18 16:44:37.820481 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.820419 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 16:44:37.821889 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.821860 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 16:44:37.822069 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.821891 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-255.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 16:44:37.822165 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.822085 2578 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 16:44:37.822165 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.822117 2578 container_manager_linux.go:306] "Creating device plugin manager"
Mar 18 16:44:37.822165 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.822137 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:44:37.822842 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.822824 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:44:37.823669 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.823658 2578 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:44:37.823794 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.823783 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 18 16:44:37.825692 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.825681 2578 kubelet.go:491] "Attempting to sync node with API server"
Mar 18 16:44:37.825755 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.825703 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 16:44:37.825755 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.825719 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 16:44:37.825755 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.825732 2578 kubelet.go:397] "Adding apiserver pod source"
Mar 18 16:44:37.825755 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.825744 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 16:44:37.826743 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.826731 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:44:37.826804 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.826758 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:44:37.830738 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.830718 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1"
Mar 18 16:44:37.831684 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.831662 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vq24j"
Mar 18 16:44:37.832877 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.832864 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 18 16:44:37.835937 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.835923 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 18 16:44:37.835937 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.835939 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.835946 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.835952 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.835957 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.835965 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.835973 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.835981 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.835992 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.836001 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.836010 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 18 16:44:37.836044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.836027 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 18 16:44:37.836354 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.836223 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Mar 18 16:44:37.836354 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.836244 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-255.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Mar 18 16:44:37.836918 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.836907 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 18 16:44:37.836948 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.836919 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Mar 18 16:44:37.839127 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.839112 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-255.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 18 16:44:37.839668 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.839653 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vq24j"
Mar 18 16:44:37.841334 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.841321 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 18 16:44:37.841379 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.841363 2578 server.go:1295] "Started kubelet"
Mar 18 16:44:37.841479 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.841455 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Mar 18 16:44:37.841581 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.841527 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 18 16:44:37.841623 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.841611 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 18 16:44:37.842538 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.842525 2578 server.go:317] "Adding debug handlers to kubelet server"
Mar 18 16:44:37.842716 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.842702 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 18 16:44:37.842771 ip-10-0-130-255 systemd[1]: Started Kubernetes Kubelet.
Mar 18 16:44:37.848591 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.848573 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Mar 18 16:44:37.849052 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.849039 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 18 16:44:37.849939 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.849919 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:37.850789 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.850768 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Mar 18 16:44:37.850789 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.850791 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 18 16:44:37.850974 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.850958 2578 reconstruct.go:97] "Volume reconstruction finished"
Mar 18 16:44:37.850974 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.850971 2578 reconciler.go:26] "Reconciler: start to sync state"
Mar 18 16:44:37.851158 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.851051 2578 factory.go:153] Registering CRI-O factory
Mar 18 16:44:37.851158 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.851066 2578 factory.go:223] Registration of the crio container factory successfully
Mar 18 16:44:37.851158 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.851151 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 18 16:44:37.851299 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.851161 2578 factory.go:55] Registering systemd factory
Mar 18 16:44:37.851299 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.851171 2578 factory.go:223] Registration of the systemd container factory successfully
Mar 18 16:44:37.851299 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.851194 2578 factory.go:103] Registering Raw factory
Mar 18 16:44:37.851299 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.851208 2578 manager.go:1196] Started watching for new ooms in manager
Mar 18 16:44:37.851935 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.851917 2578 manager.go:319] Starting recovery of all containers
Mar 18 16:44:37.853571 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.853429 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 18 16:44:37.859607 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.854068 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:37.859607 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.854999 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 18 16:44:37.859607 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.858271 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-255.ec2.internal\" not found" node="ip-10-0-130-255.ec2.internal"
Mar 18 16:44:37.863488 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.863336 2578 manager.go:324] Recovery completed
Mar 18 16:44:37.864747 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.864726 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Mar 18 16:44:37.867669 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.867658 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:37.869889 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.869873 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:37.869952 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.869901 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:37.869952 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.869912 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:37.870408 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.870396 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Mar 18 16:44:37.870477 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.870409 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Mar 18 16:44:37.870477 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.870428 2578 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:44:37.871527 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.871514 2578 policy_none.go:49] "None policy: Start"
Mar 18 16:44:37.871592 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.871532 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 18 16:44:37.871592 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.871546 2578 state_mem.go:35] "Initializing new in-memory state store"
Mar 18 16:44:37.911453 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.911436 2578 manager.go:341] "Starting Device Plugin manager"
Mar 18 16:44:37.911573 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.911475 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 18 16:44:37.911573 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.911528 2578 server.go:85] "Starting device plugin registration server"
Mar 18 16:44:37.911796 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.911783 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 18 16:44:37.911875 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.911799 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 18 16:44:37.911946 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.911929 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 18 16:44:37.912061 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.912007 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 18 16:44:37.912061 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.912027 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 18 16:44:37.912914 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.912885 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Mar 18 16:44:37.929542 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.912935 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:37.990954 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.990865 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 18 16:44:37.992288 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.992271 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 18 16:44:37.992367 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.992299 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 18 16:44:37.992367 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.992319 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 18 16:44:37.992367 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.992326 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Mar 18 16:44:37.992506 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:37.992369 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 18 16:44:37.994838 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:37.994814 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:38.012906 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.012876 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:38.013836 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.013819 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:38.013910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.013854 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:38.013910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.013866 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:38.013910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.013897 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.026471 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.026453 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.026520 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.026477 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-255.ec2.internal\": node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.052323 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.052299 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.093370 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.093313 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"]
Mar 18 16:44:38.093451 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.093422 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:38.094399 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.094384 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:38.094478 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.094414 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:38.094478 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.094430 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:38.095728 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.095716 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:38.095881 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.095867 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.095928 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.095894 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:38.096499 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.096481 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:38.096562 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.096498 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:38.096562 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.096513 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:38.096562 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.096521 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:38.096562 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.096524 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:38.096562 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.096531 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:38.097852 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.097834 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.097911 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.097869 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 18 16:44:38.098631 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.098615 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientMemory"
Mar 18 16:44:38.098691 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.098647 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasNoDiskPressure"
Mar 18 16:44:38.098691 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.098656 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeHasSufficientPID"
Mar 18 16:44:38.125926 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.125900 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-255.ec2.internal\" not found" node="ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.130453 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.130435 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-255.ec2.internal\" not found" node="ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.152596 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.152567 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.253365 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.253299 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.253483 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.253369 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3951036463533dad8f432e9afffc69c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"3951036463533dad8f432e9afffc69c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.253483 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.253402 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3951036463533dad8f432e9afffc69c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"3951036463533dad8f432e9afffc69c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.253483 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.253422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/576a374e63f549f919ff7ffbf00e5e30-config\") pod \"kube-apiserver-proxy-ip-10-0-130-255.ec2.internal\" (UID: \"576a374e63f549f919ff7ffbf00e5e30\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.354166 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.354134 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.354311 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.354184 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3951036463533dad8f432e9afffc69c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"3951036463533dad8f432e9afffc69c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.354311 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.354217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/576a374e63f549f919ff7ffbf00e5e30-config\") pod \"kube-apiserver-proxy-ip-10-0-130-255.ec2.internal\" (UID: \"576a374e63f549f919ff7ffbf00e5e30\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.354311 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.354237 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3951036463533dad8f432e9afffc69c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"3951036463533dad8f432e9afffc69c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.354311 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.354280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3951036463533dad8f432e9afffc69c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"3951036463533dad8f432e9afffc69c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.354311 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.354294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3951036463533dad8f432e9afffc69c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal\" (UID: \"3951036463533dad8f432e9afffc69c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.354311 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.354283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/576a374e63f549f919ff7ffbf00e5e30-config\") pod \"kube-apiserver-proxy-ip-10-0-130-255.ec2.internal\" (UID: \"576a374e63f549f919ff7ffbf00e5e30\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.429446 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.429406 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.432940 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.432918 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:38.455246 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.455214 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.555622 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.555597 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.655979 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.655949 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.756433 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.756402 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.760571 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.760549 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 18 16:44:38.760737 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.760713 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:44:38.760788 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.760713 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 18 16:44:38.841882 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.841800 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:39:37 +0000 UTC" deadline="2027-08-15 22:49:53.944664809 +0000 UTC"
Mar 18 16:44:38.841882 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.841847 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12366h5m15.102835728s"
Mar 18 16:44:38.848973 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.848953 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 18 16:44:38.856469 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.856446 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.860125 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.860087 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:44:38.893553 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.893526 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-778x8"
Mar 18 16:44:38.902287 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.902269 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-778x8"
Mar 18 16:44:38.956541 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:38.956498 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-255.ec2.internal\" not found"
Mar 18 16:44:38.976834 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:38.976796 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3951036463533dad8f432e9afffc69c0.slice/crio-151e0e0c0058ce667163c4657cd33943a0f2c00410a628749655a5abc0415828 WatchSource:0}: Error finding container 151e0e0c0058ce667163c4657cd33943a0f2c00410a628749655a5abc0415828: Status 404 returned error can't find the container with id 151e0e0c0058ce667163c4657cd33943a0f2c00410a628749655a5abc0415828
Mar 18 16:44:38.977223 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:38.977197 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod576a374e63f549f919ff7ffbf00e5e30.slice/crio-21ed71d4d3af80052045387c2eb8d69c5aa2f0a58b6133c7762b00ee98092b5b WatchSource:0}: Error finding container 21ed71d4d3af80052045387c2eb8d69c5aa2f0a58b6133c7762b00ee98092b5b: Status 404 returned error can't find the container with id 21ed71d4d3af80052045387c2eb8d69c5aa2f0a58b6133c7762b00ee98092b5b
Mar 18 16:44:38.982303 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.982279 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:44:38.996348 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.996301 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal" event={"ID":"576a374e63f549f919ff7ffbf00e5e30","Type":"ContainerStarted","Data":"21ed71d4d3af80052045387c2eb8d69c5aa2f0a58b6133c7762b00ee98092b5b"}
Mar 18 16:44:38.997358 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:38.997340 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal" event={"ID":"3951036463533dad8f432e9afffc69c0","Type":"ContainerStarted","Data":"151e0e0c0058ce667163c4657cd33943a0f2c00410a628749655a5abc0415828"}
Mar 18 16:44:39.027431 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.027412 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:39.050037 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.050011 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:39.062346 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.062328 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 18 16:44:39.064309 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.064294 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"
Mar 18 16:44:39.072430 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.072415 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 18 16:44:39.286109 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.286075 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:44:39.827729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.827695 2578 apiserver.go:52] "Watching
apiserver" Mar 18 16:44:39.832759 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.832732 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 18 16:44:39.834247 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.834220 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl","openshift-cluster-node-tuning-operator/tuned-j8fzd","openshift-dns/node-resolver-tvxnf","openshift-image-registry/node-ca-x499v","openshift-multus/multus-64cg4","openshift-multus/multus-additional-cni-plugins-d78r7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal","openshift-multus/network-metrics-daemon-jcvjp","openshift-network-diagnostics/network-check-target-b5628","openshift-network-operator/iptables-alerter-shkth","openshift-ovn-kubernetes/ovnkube-node-hjd4v","kube-system/konnectivity-agent-ttdbk","kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal"] Mar 18 16:44:39.836651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.836629 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-shkth" Mar 18 16:44:39.837830 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.837797 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.838618 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.838602 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:39.838830 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.838815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-mcmfr\"" Mar 18 16:44:39.839016 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.838998 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:39.839172 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.839004 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tvxnf" Mar 18 16:44:39.839329 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.839310 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 18 16:44:39.840017 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.839997 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:44:39.840118 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.840077 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x499v" Mar 18 16:44:39.840180 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.839999 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-srxh7\"" Mar 18 16:44:39.840249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.840222 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.840445 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.840427 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 18 16:44:39.840870 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.840852 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 18 16:44:39.841040 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.841025 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-67h2l\"" Mar 18 16:44:39.841231 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.841212 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 18 16:44:39.841644 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.841625 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 18 16:44:39.841855 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.841838 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 18 16:44:39.842221 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.842204 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pxc2z\"" Mar 18 16:44:39.842350 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.842334 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 18 16:44:39.842435 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.842421 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"default-dockercfg-bc7nv\"" Mar 18 16:44:39.842555 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.842538 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 18 16:44:39.842767 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.842752 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 18 16:44:39.842986 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.842962 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.843355 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.843336 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 18 16:44:39.843433 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.843379 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 18 16:44:39.844169 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.844152 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:39.844253 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:39.844220 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:39.844693 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.844675 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 18 16:44:39.844971 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.844950 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 18 16:44:39.845216 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.845200 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dzv4d\"" Mar 18 16:44:39.845596 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.845580 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:39.845663 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:39.845641 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:39.845743 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.845684 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.846895 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.846879 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.847499 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.847477 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 18 16:44:39.847581 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.847535 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-czlpq\"" Mar 18 16:44:39.847581 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.847553 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 18 16:44:39.847659 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.847489 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 18 16:44:39.848372 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.848348 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-ttdbk" Mar 18 16:44:39.848693 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.848676 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 18 16:44:39.849540 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.849518 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hxg5b\"" Mar 18 16:44:39.849895 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.849833 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 18 16:44:39.850122 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.850083 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 18 16:44:39.850420 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.850357 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 18 16:44:39.850789 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.850772 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 18 16:44:39.850861 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.850819 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gt48b\"" Mar 18 16:44:39.850910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.850885 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 18 16:44:39.852269 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.851134 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 
18 16:44:39.852269 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.851169 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 18 16:44:39.856005 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.855986 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 18 16:44:39.861410 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861389 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/245b5eb7-bf78-4c89-8b17-d9e75f2f63d8-konnectivity-ca\") pod \"konnectivity-agent-ttdbk\" (UID: \"245b5eb7-bf78-4c89-8b17-d9e75f2f63d8\") " pod="kube-system/konnectivity-agent-ttdbk" Mar 18 16:44:39.861502 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861422 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/70516603-846e-4c1f-80db-9b409ecc2c96-host-slash\") pod \"iptables-alerter-shkth\" (UID: \"70516603-846e-4c1f-80db-9b409ecc2c96\") " pod="openshift-network-operator/iptables-alerter-shkth" Mar 18 16:44:39.861502 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861439 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-var-lib-kubelet\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.861502 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861462 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-hostroot\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " 
pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.861502 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861483 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-run-systemd\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.861695 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-sysconfig\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.861695 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861572 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-run-ovn\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.861695 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-system-cni-dir\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.861695 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64t2x\" (UniqueName: \"kubernetes.io/projected/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-kube-api-access-64t2x\") pod 
\"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:39.861856 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861696 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1ec73fb-e596-4c69-abc5-b3073ed73133-env-overrides\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.861856 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861720 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdnwf\" (UniqueName: \"kubernetes.io/projected/d1ec73fb-e596-4c69-abc5-b3073ed73133-kube-api-access-rdnwf\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.861856 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-kubernetes\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.861856 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.861856 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861785 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-tuned\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.861856 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e48c47f8-3be3-4bee-bab5-5a2d007486f8-hosts-file\") pod \"node-resolver-tvxnf\" (UID: \"e48c47f8-3be3-4bee-bab5-5a2d007486f8\") " pod="openshift-dns/node-resolver-tvxnf" Mar 18 16:44:39.861856 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861848 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1ec73fb-e596-4c69-abc5-b3073ed73133-ovnkube-config\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-systemd\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861887 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-system-cni-dir\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 
16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861917 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861955 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.861991 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-run-k8s-cni-cncf-io\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-log-socket\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862044 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d1ec73fb-e596-4c69-abc5-b3073ed73133-ovn-node-metrics-cert\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862087 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e48c47f8-3be3-4bee-bab5-5a2d007486f8-tmp-dir\") pod \"node-resolver-tvxnf\" (UID: \"e48c47f8-3be3-4bee-bab5-5a2d007486f8\") " pod="openshift-dns/node-resolver-tvxnf" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-os-release\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.862182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862184 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-device-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.862651 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:44:39.862216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-cni-binary-copy\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862264 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-etc-selinux\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-cni-dir\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862321 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/70516603-846e-4c1f-80db-9b409ecc2c96-iptables-alerter-script\") pod \"iptables-alerter-shkth\" (UID: \"70516603-846e-4c1f-80db-9b409ecc2c96\") " pod="openshift-network-operator/iptables-alerter-shkth" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862344 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbb42fee-4a86-4fbc-b701-d582b093b57a-host\") pod 
\"node-ca-x499v\" (UID: \"bbb42fee-4a86-4fbc-b701-d582b093b57a\") " pod="openshift-image-registry/node-ca-x499v" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-var-lib-cni-multus\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862420 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-kubelet\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862464 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-node-log\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862503 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-run-ovn-kubernetes\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862529 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-sys\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862553 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-daemon-config\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-run-openvswitch\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862624 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1ec73fb-e596-4c69-abc5-b3073ed73133-ovnkube-script-lib\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.862651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862646 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-run\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862668 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-var-lib-kubelet\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862695 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-tmp\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbds\" (UniqueName: \"kubernetes.io/projected/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-kube-api-access-7jbds\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-cnibin\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-socket-dir-parent\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " 
pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862786 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-run-multus-certs\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-etc-openvswitch\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-cni-bin\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-sysctl-d\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-registration-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc27bc09-bb46-4e2c-878a-fbd2388a8177-cni-binary-copy\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862973 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzkg4\" (UniqueName: \"kubernetes.io/projected/cc27bc09-bb46-4e2c-878a-fbd2388a8177-kube-api-access-zzkg4\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.862998 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-modprobe-d\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-host\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863066 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-socket-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.863219 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/245b5eb7-bf78-4c89-8b17-d9e75f2f63d8-agent-certs\") pod \"konnectivity-agent-ttdbk\" (UID: \"245b5eb7-bf78-4c89-8b17-d9e75f2f63d8\") " pod="kube-system/konnectivity-agent-ttdbk" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863134 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bxhs\" (UniqueName: \"kubernetes.io/projected/70516603-846e-4c1f-80db-9b409ecc2c96-kube-api-access-9bxhs\") pod \"iptables-alerter-shkth\" (UID: \"70516603-846e-4c1f-80db-9b409ecc2c96\") " pod="openshift-network-operator/iptables-alerter-shkth" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863152 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-os-release\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863165 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-run-netns\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 
16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-conf-dir\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863195 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-etc-kubernetes\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863208 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-cni-netd\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863221 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-lib-modules\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpmfb\" (UniqueName: \"kubernetes.io/projected/bbb42fee-4a86-4fbc-b701-d582b093b57a-kube-api-access-qpmfb\") pod \"node-ca-x499v\" (UID: 
\"bbb42fee-4a86-4fbc-b701-d582b093b57a\") " pod="openshift-image-registry/node-ca-x499v" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-sysctl-conf\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863263 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-sys-fs\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snctt\" (UniqueName: \"kubernetes.io/projected/f41d5653-51e9-4c4c-9238-8623342b9fb5-kube-api-access-snctt\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863299 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbb42fee-4a86-4fbc-b701-d582b093b57a-serviceca\") pod \"node-ca-x499v\" (UID: \"bbb42fee-4a86-4fbc-b701-d582b093b57a\") " pod="openshift-image-registry/node-ca-x499v" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863318 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6czkt\" (UniqueName: \"kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt\") pod \"network-check-target-b5628\" (UID: \"4e172408-4d26-4b03-a0eb-bfcb801cdadc\") " pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863359 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-cnibin\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.863649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863372 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-var-lib-cni-bin\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.864220 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:44:39.863386 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-slash\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.864220 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-var-lib-openvswitch\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.864220 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863413 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kjzb\" (UniqueName: \"kubernetes.io/projected/e48c47f8-3be3-4bee-bab5-5a2d007486f8-kube-api-access-6kjzb\") pod \"node-resolver-tvxnf\" (UID: \"e48c47f8-3be3-4bee-bab5-5a2d007486f8\") " pod="openshift-dns/node-resolver-tvxnf" Mar 18 16:44:39.864220 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863428 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8vx5\" (UniqueName: \"kubernetes.io/projected/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-kube-api-access-z8vx5\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.864220 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-systemd-units\") pod 
\"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.864220 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.863469 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-run-netns\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.903147 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.903116 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:38 +0000 UTC" deadline="2027-08-14 09:29:18.130499494 +0000 UTC" Mar 18 16:44:39.903252 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.903150 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12328h44m38.227355314s" Mar 18 16:44:39.926017 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.925993 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:39.964180 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpmfb\" (UniqueName: \"kubernetes.io/projected/bbb42fee-4a86-4fbc-b701-d582b093b57a-kube-api-access-qpmfb\") pod \"node-ca-x499v\" (UID: \"bbb42fee-4a86-4fbc-b701-d582b093b57a\") " pod="openshift-image-registry/node-ca-x499v" Mar 18 16:44:39.964380 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964184 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-sysctl-conf\") pod \"tuned-j8fzd\" (UID: 
\"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.964450 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-sys-fs\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.964450 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964429 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snctt\" (UniqueName: \"kubernetes.io/projected/f41d5653-51e9-4c4c-9238-8623342b9fb5-kube-api-access-snctt\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.964550 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbb42fee-4a86-4fbc-b701-d582b093b57a-serviceca\") pod \"node-ca-x499v\" (UID: \"bbb42fee-4a86-4fbc-b701-d582b093b57a\") " pod="openshift-image-registry/node-ca-x499v" Mar 18 16:44:39.964550 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-sysctl-conf\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.964550 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964475 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:39.964550 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6czkt\" (UniqueName: \"kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt\") pod \"network-check-target-b5628\" (UID: \"4e172408-4d26-4b03-a0eb-bfcb801cdadc\") " pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:39.964550 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964514 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.964550 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-sys-fs\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.964550 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-cnibin\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.964550 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964544 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-var-lib-cni-bin\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-slash\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964574 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-var-lib-openvswitch\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964590 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kjzb\" (UniqueName: \"kubernetes.io/projected/e48c47f8-3be3-4bee-bab5-5a2d007486f8-kube-api-access-6kjzb\") pod \"node-resolver-tvxnf\" (UID: \"e48c47f8-3be3-4bee-bab5-5a2d007486f8\") " pod="openshift-dns/node-resolver-tvxnf" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8vx5\" (UniqueName: \"kubernetes.io/projected/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-kube-api-access-z8vx5\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.964910 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:44:39.964619 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-systemd-units\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-run-netns\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/245b5eb7-bf78-4c89-8b17-d9e75f2f63d8-konnectivity-ca\") pod \"konnectivity-agent-ttdbk\" (UID: \"245b5eb7-bf78-4c89-8b17-d9e75f2f63d8\") " pod="kube-system/konnectivity-agent-ttdbk" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/70516603-846e-4c1f-80db-9b409ecc2c96-host-slash\") pod \"iptables-alerter-shkth\" (UID: \"70516603-846e-4c1f-80db-9b409ecc2c96\") " pod="openshift-network-operator/iptables-alerter-shkth" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964700 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-var-lib-kubelet\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.964910 
ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-hostroot\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-run-systemd\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:39.964735 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964760 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-run-systemd\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/70516603-846e-4c1f-80db-9b409ecc2c96-host-slash\") pod \"iptables-alerter-shkth\" (UID: \"70516603-846e-4c1f-80db-9b409ecc2c96\") " pod="openshift-network-operator/iptables-alerter-shkth" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:39.964810 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs 
podName:474e4d0f-dbfc-41a4-ad8f-fcada6a1b880 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.464787767 +0000 UTC m=+3.049561786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs") pod "network-metrics-daemon-jcvjp" (UID: "474e4d0f-dbfc-41a4-ad8f-fcada6a1b880") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964830 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-var-lib-kubelet\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.964910 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964850 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-run-netns\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964857 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-hostroot\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964867 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-systemd-units\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-var-lib-cni-bin\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964909 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-sysconfig\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-var-lib-openvswitch\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964939 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-run-ovn\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964967 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-slash\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-system-cni-dir\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-sysconfig\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965006 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-run-ovn\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64t2x\" (UniqueName: \"kubernetes.io/projected/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-kube-api-access-64t2x\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.964972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-cnibin\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-system-cni-dir\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1ec73fb-e596-4c69-abc5-b3073ed73133-env-overrides\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965074 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdnwf\" (UniqueName: \"kubernetes.io/projected/d1ec73fb-e596-4c69-abc5-b3073ed73133-kube-api-access-rdnwf\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965201 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-kubernetes\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.965729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965239 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-kubernetes\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-tuned\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965269 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e48c47f8-3be3-4bee-bab5-5a2d007486f8-hosts-file\") pod \"node-resolver-tvxnf\" (UID: \"e48c47f8-3be3-4bee-bab5-5a2d007486f8\") " pod="openshift-dns/node-resolver-tvxnf"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965327 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e48c47f8-3be3-4bee-bab5-5a2d007486f8-hosts-file\") pod \"node-resolver-tvxnf\" (UID: \"e48c47f8-3be3-4bee-bab5-5a2d007486f8\") " pod="openshift-dns/node-resolver-tvxnf"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1ec73fb-e596-4c69-abc5-b3073ed73133-ovnkube-config\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965382 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-systemd\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965450 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-system-cni-dir\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965472 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/245b5eb7-bf78-4c89-8b17-d9e75f2f63d8-konnectivity-ca\") pod \"konnectivity-agent-ttdbk\" (UID: \"245b5eb7-bf78-4c89-8b17-d9e75f2f63d8\") " pod="kube-system/konnectivity-agent-ttdbk"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-run-k8s-cni-cncf-io\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-log-socket\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965544 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-system-cni-dir\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1ec73fb-e596-4c69-abc5-b3073ed73133-ovn-node-metrics-cert\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965534 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e48c47f8-3be3-4bee-bab5-5a2d007486f8-tmp-dir\") pod \"node-resolver-tvxnf\" (UID: \"e48c47f8-3be3-4bee-bab5-5a2d007486f8\") " pod="openshift-dns/node-resolver-tvxnf"
Mar 18 16:44:39.966554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965598 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-systemd\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-os-release\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965619 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-run-k8s-cni-cncf-io\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965639 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965666 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-device-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-cni-binary-copy\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-etc-selinux\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-cni-dir\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965768 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/70516603-846e-4c1f-80db-9b409ecc2c96-iptables-alerter-script\") pod \"iptables-alerter-shkth\" (UID: \"70516603-846e-4c1f-80db-9b409ecc2c96\") " pod="openshift-network-operator/iptables-alerter-shkth"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbb42fee-4a86-4fbc-b701-d582b093b57a-host\") pod \"node-ca-x499v\" (UID: \"bbb42fee-4a86-4fbc-b701-d582b093b57a\") " pod="openshift-image-registry/node-ca-x499v"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-var-lib-cni-multus\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-kubelet\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965861 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-node-log\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-run-ovn-kubernetes\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-sys\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965918 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1ec73fb-e596-4c69-abc5-b3073ed73133-ovnkube-config\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965936 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-daemon-config\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.967363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e48c47f8-3be3-4bee-bab5-5a2d007486f8-tmp-dir\") pod \"node-resolver-tvxnf\" (UID: \"e48c47f8-3be3-4bee-bab5-5a2d007486f8\") " pod="openshift-dns/node-resolver-tvxnf"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-run-openvswitch\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1ec73fb-e596-4c69-abc5-b3073ed73133-ovnkube-script-lib\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-log-socket\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-run\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-var-lib-kubelet\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966047 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-device-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-tmp\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-os-release\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbds\" (UniqueName: \"kubernetes.io/projected/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-kube-api-access-7jbds\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966159 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-cnibin\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.965714 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1ec73fb-e596-4c69-abc5-b3073ed73133-env-overrides\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966187 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-socket-dir-parent\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966194 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-run-ovn-kubernetes\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966225 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.968082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966582 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-run-multus-certs\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-socket-dir-parent\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-etc-openvswitch\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-run-multus-certs\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966736 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-cni-binary-copy\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966905 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-etc-openvswitch\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.966968 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-cni-dir\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-cni-bin\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967059 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bbb42fee-4a86-4fbc-b701-d582b093b57a-serviceca\") pod \"node-ca-x499v\" (UID: \"bbb42fee-4a86-4fbc-b701-d582b093b57a\") " pod="openshift-image-registry/node-ca-x499v"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-sysctl-d\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-registration-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-var-lib-cni-multus\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc27bc09-bb46-4e2c-878a-fbd2388a8177-cni-binary-copy\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967290 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-sysctl-d\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzkg4\" (UniqueName: \"kubernetes.io/projected/cc27bc09-bb46-4e2c-878a-fbd2388a8177-kube-api-access-zzkg4\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-run\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967348 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-modprobe-d\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967551 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-node-log\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.968717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-kubelet\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-registration-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-cni-bin\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967810 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-cnibin\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.967960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-etc-selinux\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968187 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-daemon-config\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-var-lib-kubelet\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968263 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-sys\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968224 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc27bc09-bb46-4e2c-878a-fbd2388a8177-cni-binary-copy\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-modprobe-d\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968368 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-host\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968431 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbb42fee-4a86-4fbc-b701-d582b093b57a-host\") pod \"node-ca-x499v\" (UID: \"bbb42fee-4a86-4fbc-b701-d582b093b57a\") " pod="openshift-image-registry/node-ca-x499v"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968462 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-host\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968570 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-socket-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968616 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-run-openvswitch\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/245b5eb7-bf78-4c89-8b17-d9e75f2f63d8-agent-certs\") pod \"konnectivity-agent-ttdbk\" (UID: \"245b5eb7-bf78-4c89-8b17-d9e75f2f63d8\") " pod="kube-system/konnectivity-agent-ttdbk"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bxhs\" (UniqueName: \"kubernetes.io/projected/70516603-846e-4c1f-80db-9b409ecc2c96-kube-api-access-9bxhs\") pod \"iptables-alerter-shkth\" (UID: \"70516603-846e-4c1f-80db-9b409ecc2c96\") " pod="openshift-network-operator/iptables-alerter-shkth"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-os-release\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4"
Mar 18 16:44:39.969401 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968771 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName:
\"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-run-netns\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-conf-dir\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-etc-kubernetes\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.968872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-cni-netd\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969014 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/70516603-846e-4c1f-80db-9b409ecc2c96-iptables-alerter-script\") pod \"iptables-alerter-shkth\" (UID: \"70516603-846e-4c1f-80db-9b409ecc2c96\") " pod="openshift-network-operator/iptables-alerter-shkth" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-etc-tuned\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-host-run-netns\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969285 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-os-release\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969311 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-etc-kubernetes\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969322 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc27bc09-bb46-4e2c-878a-fbd2388a8177-multus-conf-dir\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-lib-modules\") pod \"tuned-j8fzd\" (UID: 
\"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969417 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1ec73fb-e596-4c69-abc5-b3073ed73133-host-cni-netd\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f41d5653-51e9-4c4c-9238-8623342b9fb5-socket-dir\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969540 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1ec73fb-e596-4c69-abc5-b3073ed73133-ovnkube-script-lib\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969908 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-lib-modules\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.970174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.969997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-tmp\") pod \"tuned-j8fzd\" (UID: 
\"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:39.971460 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.971436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1ec73fb-e596-4c69-abc5-b3073ed73133-ovn-node-metrics-cert\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.973087 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.973066 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/245b5eb7-bf78-4c89-8b17-d9e75f2f63d8-agent-certs\") pod \"konnectivity-agent-ttdbk\" (UID: \"245b5eb7-bf78-4c89-8b17-d9e75f2f63d8\") " pod="kube-system/konnectivity-agent-ttdbk" Mar 18 16:44:39.982426 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:39.982393 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:39.982426 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:39.982418 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:39.982579 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:39.982432 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6czkt for pod openshift-network-diagnostics/network-check-target-b5628: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:39.982579 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:39.982498 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt podName:4e172408-4d26-4b03-a0eb-bfcb801cdadc nodeName:}" failed. No retries permitted until 2026-03-18 16:44:40.482482896 +0000 UTC m=+3.067256899 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6czkt" (UniqueName: "kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt") pod "network-check-target-b5628" (UID: "4e172408-4d26-4b03-a0eb-bfcb801cdadc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:39.984649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.984624 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snctt\" (UniqueName: \"kubernetes.io/projected/f41d5653-51e9-4c4c-9238-8623342b9fb5-kube-api-access-snctt\") pod \"aws-ebs-csi-driver-node-r9kkl\" (UID: \"f41d5653-51e9-4c4c-9238-8623342b9fb5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:39.987146 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.987123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpmfb\" (UniqueName: \"kubernetes.io/projected/bbb42fee-4a86-4fbc-b701-d582b093b57a-kube-api-access-qpmfb\") pod \"node-ca-x499v\" (UID: \"bbb42fee-4a86-4fbc-b701-d582b093b57a\") " pod="openshift-image-registry/node-ca-x499v" Mar 18 16:44:39.993710 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.993687 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8vx5\" (UniqueName: \"kubernetes.io/projected/c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf-kube-api-access-z8vx5\") pod \"multus-additional-cni-plugins-d78r7\" (UID: \"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf\") " pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:39.994319 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:44:39.994295 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzkg4\" (UniqueName: \"kubernetes.io/projected/cc27bc09-bb46-4e2c-878a-fbd2388a8177-kube-api-access-zzkg4\") pod \"multus-64cg4\" (UID: \"cc27bc09-bb46-4e2c-878a-fbd2388a8177\") " pod="openshift-multus/multus-64cg4" Mar 18 16:44:39.994617 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.994589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdnwf\" (UniqueName: \"kubernetes.io/projected/d1ec73fb-e596-4c69-abc5-b3073ed73133-kube-api-access-rdnwf\") pod \"ovnkube-node-hjd4v\" (UID: \"d1ec73fb-e596-4c69-abc5-b3073ed73133\") " pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:39.995055 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.995034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64t2x\" (UniqueName: \"kubernetes.io/projected/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-kube-api-access-64t2x\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:39.995369 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.995347 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bxhs\" (UniqueName: \"kubernetes.io/projected/70516603-846e-4c1f-80db-9b409ecc2c96-kube-api-access-9bxhs\") pod \"iptables-alerter-shkth\" (UID: \"70516603-846e-4c1f-80db-9b409ecc2c96\") " pod="openshift-network-operator/iptables-alerter-shkth" Mar 18 16:44:39.996578 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.996555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kjzb\" (UniqueName: \"kubernetes.io/projected/e48c47f8-3be3-4bee-bab5-5a2d007486f8-kube-api-access-6kjzb\") pod \"node-resolver-tvxnf\" (UID: \"e48c47f8-3be3-4bee-bab5-5a2d007486f8\") " pod="openshift-dns/node-resolver-tvxnf" 
Mar 18 16:44:39.996917 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:39.996896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbds\" (UniqueName: \"kubernetes.io/projected/a1398da0-e9bd-4d1e-a150-d44bea8e2d78-kube-api-access-7jbds\") pod \"tuned-j8fzd\" (UID: \"a1398da0-e9bd-4d1e-a150-d44bea8e2d78\") " pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:40.149245 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.149215 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-shkth" Mar 18 16:44:40.157035 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.157015 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" Mar 18 16:44:40.168637 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.168616 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:44:40.171182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.171165 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-tvxnf" Mar 18 16:44:40.176534 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.176514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x499v" Mar 18 16:44:40.183044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.183026 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-64cg4" Mar 18 16:44:40.191548 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.191530 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d78r7" Mar 18 16:44:40.197075 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.197057 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" Mar 18 16:44:40.203614 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.203594 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" Mar 18 16:44:40.210202 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.210183 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ttdbk" Mar 18 16:44:40.473310 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.473220 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:40.473452 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:40.473396 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:40.473452 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:40.473446 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs podName:474e4d0f-dbfc-41a4-ad8f-fcada6a1b880 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:41.473431422 +0000 UTC m=+4.058205426 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs") pod "network-metrics-daemon-jcvjp" (UID: "474e4d0f-dbfc-41a4-ad8f-fcada6a1b880") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:40.540170 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:40.540135 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc27bc09_bb46_4e2c_878a_fbd2388a8177.slice/crio-6beb06903b5a845e7ba6b2ccf3cfd7d83640e51b36fa4e2bef2883301ad948e3 WatchSource:0}: Error finding container 6beb06903b5a845e7ba6b2ccf3cfd7d83640e51b36fa4e2bef2883301ad948e3: Status 404 returned error can't find the container with id 6beb06903b5a845e7ba6b2ccf3cfd7d83640e51b36fa4e2bef2883301ad948e3 Mar 18 16:44:40.543670 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:40.543634 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc43e0b46_24aa_44cd_bf3f_f8e27b1abbdf.slice/crio-40b595e5c8a2bb78736ce29041b34be80da32ece20cc20ed20436cbb606c071c WatchSource:0}: Error finding container 40b595e5c8a2bb78736ce29041b34be80da32ece20cc20ed20436cbb606c071c: Status 404 returned error can't find the container with id 40b595e5c8a2bb78736ce29041b34be80da32ece20cc20ed20436cbb606c071c Mar 18 16:44:40.544676 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:40.544652 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod245b5eb7_bf78_4c89_8b17_d9e75f2f63d8.slice/crio-50b71350bcf5ce17cc0301bc63bfe0407fa82ad38876d8dbd229b198aecb66ad WatchSource:0}: Error finding container 50b71350bcf5ce17cc0301bc63bfe0407fa82ad38876d8dbd229b198aecb66ad: Status 404 returned error can't find the container with id 50b71350bcf5ce17cc0301bc63bfe0407fa82ad38876d8dbd229b198aecb66ad Mar 18 16:44:40.546583 
ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:40.546544 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf41d5653_51e9_4c4c_9238_8623342b9fb5.slice/crio-5d8dae64065d1dd957053c81dffe1ba7d35131d2e02a9b3a1dd0b8a754f8b99b WatchSource:0}: Error finding container 5d8dae64065d1dd957053c81dffe1ba7d35131d2e02a9b3a1dd0b8a754f8b99b: Status 404 returned error can't find the container with id 5d8dae64065d1dd957053c81dffe1ba7d35131d2e02a9b3a1dd0b8a754f8b99b Mar 18 16:44:40.548634 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:40.548604 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ec73fb_e596_4c69_abc5_b3073ed73133.slice/crio-0c5757dbd2d5c8f45ef1397429be48286c4c8dd00bd96a84beb68cbefc671857 WatchSource:0}: Error finding container 0c5757dbd2d5c8f45ef1397429be48286c4c8dd00bd96a84beb68cbefc671857: Status 404 returned error can't find the container with id 0c5757dbd2d5c8f45ef1397429be48286c4c8dd00bd96a84beb68cbefc671857 Mar 18 16:44:40.549669 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:40.549648 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode48c47f8_3be3_4bee_bab5_5a2d007486f8.slice/crio-de312811314e0095ce99d507f27b4e36c9eb3e9b0503fc076d4ab4c53bf572da WatchSource:0}: Error finding container de312811314e0095ce99d507f27b4e36c9eb3e9b0503fc076d4ab4c53bf572da: Status 404 returned error can't find the container with id de312811314e0095ce99d507f27b4e36c9eb3e9b0503fc076d4ab4c53bf572da Mar 18 16:44:40.550133 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:40.550108 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70516603_846e_4c1f_80db_9b409ecc2c96.slice/crio-debd469d47283aa86198b08f2b5c4e0f46cb70a3c380386033fa5f2ebf3aa416 WatchSource:0}: 
Error finding container debd469d47283aa86198b08f2b5c4e0f46cb70a3c380386033fa5f2ebf3aa416: Status 404 returned error can't find the container with id debd469d47283aa86198b08f2b5c4e0f46cb70a3c380386033fa5f2ebf3aa416 Mar 18 16:44:40.551320 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:44:40.551214 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbb42fee_4a86_4fbc_b701_d582b093b57a.slice/crio-76726a4ba881cdc035a2b0a49b869e81b92f3cb2f0789b7728437ac584165bbf WatchSource:0}: Error finding container 76726a4ba881cdc035a2b0a49b869e81b92f3cb2f0789b7728437ac584165bbf: Status 404 returned error can't find the container with id 76726a4ba881cdc035a2b0a49b869e81b92f3cb2f0789b7728437ac584165bbf Mar 18 16:44:40.574187 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.574167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6czkt\" (UniqueName: \"kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt\") pod \"network-check-target-b5628\" (UID: \"4e172408-4d26-4b03-a0eb-bfcb801cdadc\") " pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:40.574300 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:40.574286 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:40.574367 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:40.574310 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:40.574367 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:40.574322 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6czkt for pod openshift-network-diagnostics/network-check-target-b5628: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:40.574477 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:40.574375 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt podName:4e172408-4d26-4b03-a0eb-bfcb801cdadc nodeName:}" failed. No retries permitted until 2026-03-18 16:44:41.574357767 +0000 UTC m=+4.159131777 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-6czkt" (UniqueName: "kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt") pod "network-check-target-b5628" (UID: "4e172408-4d26-4b03-a0eb-bfcb801cdadc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:40.904315 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.904141 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:39:38 +0000 UTC" deadline="2027-10-10 04:38:26.567636906 +0000 UTC" Mar 18 16:44:40.904315 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:40.904312 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13691h53m45.66332743s" Mar 18 16:44:41.008436 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.007815 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x499v" event={"ID":"bbb42fee-4a86-4fbc-b701-d582b093b57a","Type":"ContainerStarted","Data":"76726a4ba881cdc035a2b0a49b869e81b92f3cb2f0789b7728437ac584165bbf"} Mar 18 16:44:41.009938 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.009885 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-shkth" 
event={"ID":"70516603-846e-4c1f-80db-9b409ecc2c96","Type":"ContainerStarted","Data":"debd469d47283aa86198b08f2b5c4e0f46cb70a3c380386033fa5f2ebf3aa416"} Mar 18 16:44:41.013630 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.013585 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerStarted","Data":"0c5757dbd2d5c8f45ef1397429be48286c4c8dd00bd96a84beb68cbefc671857"} Mar 18 16:44:41.015163 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.015113 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" event={"ID":"f41d5653-51e9-4c4c-9238-8623342b9fb5","Type":"ContainerStarted","Data":"5d8dae64065d1dd957053c81dffe1ba7d35131d2e02a9b3a1dd0b8a754f8b99b"} Mar 18 16:44:41.017464 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.017374 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d78r7" event={"ID":"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf","Type":"ContainerStarted","Data":"40b595e5c8a2bb78736ce29041b34be80da32ece20cc20ed20436cbb606c071c"} Mar 18 16:44:41.020639 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.020613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-64cg4" event={"ID":"cc27bc09-bb46-4e2c-878a-fbd2388a8177","Type":"ContainerStarted","Data":"6beb06903b5a845e7ba6b2ccf3cfd7d83640e51b36fa4e2bef2883301ad948e3"} Mar 18 16:44:41.022367 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.022342 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" event={"ID":"a1398da0-e9bd-4d1e-a150-d44bea8e2d78","Type":"ContainerStarted","Data":"a89398f6e19835ae5cfeb963c8adc754fd5ea194a90e3438f2554371306e03f7"} Mar 18 16:44:41.025311 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.023974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kube-system/konnectivity-agent-ttdbk" event={"ID":"245b5eb7-bf78-4c89-8b17-d9e75f2f63d8","Type":"ContainerStarted","Data":"50b71350bcf5ce17cc0301bc63bfe0407fa82ad38876d8dbd229b198aecb66ad"} Mar 18 16:44:41.029362 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.029337 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tvxnf" event={"ID":"e48c47f8-3be3-4bee-bab5-5a2d007486f8","Type":"ContainerStarted","Data":"de312811314e0095ce99d507f27b4e36c9eb3e9b0503fc076d4ab4c53bf572da"} Mar 18 16:44:41.032209 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.032186 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal" event={"ID":"576a374e63f549f919ff7ffbf00e5e30","Type":"ContainerStarted","Data":"9b0fbd6792b1399a5fb68b049e83bfd8030f12a94f6329d0f979fa3918517602"} Mar 18 16:44:41.043795 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.043728 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-255.ec2.internal" podStartSLOduration=2.043713118 podStartE2EDuration="2.043713118s" podCreationTimestamp="2026-03-18 16:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:41.043688484 +0000 UTC m=+3.628462514" watchObservedRunningTime="2026-03-18 16:44:41.043713118 +0000 UTC m=+3.628487145" Mar 18 16:44:41.481406 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.481371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:41.481570 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:41.481538 2578 secret.go:189] 
Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:41.481635 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:41.481595 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs podName:474e4d0f-dbfc-41a4-ad8f-fcada6a1b880 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:43.481577141 +0000 UTC m=+6.066351147 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs") pod "network-metrics-daemon-jcvjp" (UID: "474e4d0f-dbfc-41a4-ad8f-fcada6a1b880") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:41.581884 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.581850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6czkt\" (UniqueName: \"kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt\") pod \"network-check-target-b5628\" (UID: \"4e172408-4d26-4b03-a0eb-bfcb801cdadc\") " pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:41.582046 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:41.582028 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:41.582133 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:41.582054 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:41.582133 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:41.582068 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6czkt for pod openshift-network-diagnostics/network-check-target-b5628: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:41.582229 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:41.582145 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt podName:4e172408-4d26-4b03-a0eb-bfcb801cdadc nodeName:}" failed. No retries permitted until 2026-03-18 16:44:43.58212493 +0000 UTC m=+6.166898948 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6czkt" (UniqueName: "kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt") pod "network-check-target-b5628" (UID: "4e172408-4d26-4b03-a0eb-bfcb801cdadc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:41.994788 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.994271 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:41.994788 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:41.994332 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:41.994788 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:41.994434 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:41.994788 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:41.994580 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:42.063161 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:42.061998 2578 generic.go:358] "Generic (PLEG): container finished" podID="3951036463533dad8f432e9afffc69c0" containerID="1bb706ba9da552400640948608dad15b48e7bee6bd7a9bb78e8674f9fe856555" exitCode=0 Mar 18 16:44:42.063161 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:42.062926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal" event={"ID":"3951036463533dad8f432e9afffc69c0","Type":"ContainerDied","Data":"1bb706ba9da552400640948608dad15b48e7bee6bd7a9bb78e8674f9fe856555"} Mar 18 16:44:43.071164 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:43.071074 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal" event={"ID":"3951036463533dad8f432e9afffc69c0","Type":"ContainerStarted","Data":"430a8535b83d558ae7b6f2f8bd420faaf013d9e8fd53b43341fdfb311dee26d4"} Mar 18 16:44:43.085748 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:43.085700 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-255.ec2.internal" podStartSLOduration=4.085682923 podStartE2EDuration="4.085682923s" podCreationTimestamp="2026-03-18 16:44:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:43.085608002 +0000 UTC m=+5.670382029" watchObservedRunningTime="2026-03-18 16:44:43.085682923 +0000 UTC m=+5.670456952" Mar 18 16:44:43.498015 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:43.497976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:43.498224 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:43.498182 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:43.498287 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:43.498246 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs podName:474e4d0f-dbfc-41a4-ad8f-fcada6a1b880 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:47.498228149 +0000 UTC m=+10.083002157 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs") pod "network-metrics-daemon-jcvjp" (UID: "474e4d0f-dbfc-41a4-ad8f-fcada6a1b880") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:43.599300 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:43.599264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6czkt\" (UniqueName: \"kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt\") pod \"network-check-target-b5628\" (UID: \"4e172408-4d26-4b03-a0eb-bfcb801cdadc\") " pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:43.599471 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:43.599437 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:43.599471 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:43.599456 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:43.599471 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:43.599469 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6czkt for pod openshift-network-diagnostics/network-check-target-b5628: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:43.599682 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:43.599530 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt podName:4e172408-4d26-4b03-a0eb-bfcb801cdadc nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:47.599510668 +0000 UTC m=+10.184284678 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6czkt" (UniqueName: "kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt") pod "network-check-target-b5628" (UID: "4e172408-4d26-4b03-a0eb-bfcb801cdadc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:43.993884 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:43.993324 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:43.993884 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:43.993469 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:43.999433 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:43.994285 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:43.999433 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:43.994676 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:45.994968 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:45.994935 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:45.995428 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:45.995075 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:45.995485 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:45.995449 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:45.995531 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:45.995509 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:47.536344 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:47.536301 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:47.536884 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:47.536428 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:47.536884 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:47.536474 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs podName:474e4d0f-dbfc-41a4-ad8f-fcada6a1b880 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:55.536461164 +0000 UTC m=+18.121235173 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs") pod "network-metrics-daemon-jcvjp" (UID: "474e4d0f-dbfc-41a4-ad8f-fcada6a1b880") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:47.637909 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:47.637377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6czkt\" (UniqueName: \"kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt\") pod \"network-check-target-b5628\" (UID: \"4e172408-4d26-4b03-a0eb-bfcb801cdadc\") " pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:47.637909 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:47.637507 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:47.637909 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:47.637526 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:47.637909 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:47.637537 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6czkt for pod openshift-network-diagnostics/network-check-target-b5628: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:47.637909 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:47.637590 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt podName:4e172408-4d26-4b03-a0eb-bfcb801cdadc nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:55.63757104 +0000 UTC m=+18.222345066 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6czkt" (UniqueName: "kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt") pod "network-check-target-b5628" (UID: "4e172408-4d26-4b03-a0eb-bfcb801cdadc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:47.994147 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:47.994079 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:47.994308 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:47.994167 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:47.994308 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:47.994283 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:47.994439 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:47.994416 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:49.993318 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:49.992564 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:49.993318 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:49.992699 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:49.993318 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:49.993138 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:49.993318 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:49.993226 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:51.992633 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:51.992601 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:51.992633 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:51.992633 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:51.993162 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:51.992743 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:51.993162 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:51.992820 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:53.993122 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:53.993074 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:53.993520 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:53.993211 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:53.993520 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:53.993278 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:53.993520 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:53.993368 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:55.590453 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:55.590420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:55.590963 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:55.590547 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:55.590963 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:55.590620 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs podName:474e4d0f-dbfc-41a4-ad8f-fcada6a1b880 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:11.590599627 +0000 UTC m=+34.175373637 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs") pod "network-metrics-daemon-jcvjp" (UID: "474e4d0f-dbfc-41a4-ad8f-fcada6a1b880") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:55.691690 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:55.691659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6czkt\" (UniqueName: \"kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt\") pod \"network-check-target-b5628\" (UID: \"4e172408-4d26-4b03-a0eb-bfcb801cdadc\") " pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:55.691873 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:55.691794 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:55.691873 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:55.691808 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:55.691873 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:55.691817 2578 projected.go:194] Error preparing data for projected volume kube-api-access-6czkt for pod openshift-network-diagnostics/network-check-target-b5628: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:55.691873 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:55.691874 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt podName:4e172408-4d26-4b03-a0eb-bfcb801cdadc nodeName:}" failed. 
No retries permitted until 2026-03-18 16:45:11.691856353 +0000 UTC m=+34.276630363 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6czkt" (UniqueName: "kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt") pod "network-check-target-b5628" (UID: "4e172408-4d26-4b03-a0eb-bfcb801cdadc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:55.992666 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:55.992629 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:55.992856 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:55.992753 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:55.992856 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:55.992809 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:55.992970 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:55.992900 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:57.993587 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:57.993561 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:44:57.993879 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:57.993658 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:44:57.993879 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:57.993743 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628" Mar 18 16:44:57.993879 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:57.993846 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc" Mar 18 16:44:59.113486 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.113044 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-tvxnf" event={"ID":"e48c47f8-3be3-4bee-bab5-5a2d007486f8","Type":"ContainerStarted","Data":"c71e9dbd727090cc8eef241dc7d37ec4cb4e9a8d268988f8920f579f546ac42e"} Mar 18 16:44:59.114569 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.114541 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x499v" event={"ID":"bbb42fee-4a86-4fbc-b701-d582b093b57a","Type":"ContainerStarted","Data":"52d04811e441c82d180a75b65b62af6d36221d7c58569a27239eacc57f26a01f"} Mar 18 16:44:59.117085 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.117063 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/0.log" Mar 18 16:44:59.117409 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.117387 2578 generic.go:358] "Generic (PLEG): container finished" podID="d1ec73fb-e596-4c69-abc5-b3073ed73133" containerID="bc410bebde824f0b999763c5351e90a8a319e8ebc75ad716b3e755b37385ce61" exitCode=1 Mar 18 16:44:59.117472 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.117418 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerStarted","Data":"f33e2eb144648ddfd44daec5338a08e12c44028e8d3543ea5f92106a1c595b8a"} Mar 18 16:44:59.117472 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.117447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerStarted","Data":"1b1e338b7f13b7a6ef9d6f8624de8a53a766e7132bddcf250e6e0e1894db40b7"} Mar 18 16:44:59.117472 
ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.117460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerStarted","Data":"6dfdbb4efd758f8f83391ce6bb51106d696adc162b54f4e181671cfe8f57b146"} Mar 18 16:44:59.117609 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.117474 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerStarted","Data":"e60beb3ffd0d8efd99a3b680f780090676a02aa8b93769d63084c873ab21c530"} Mar 18 16:44:59.117609 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.117487 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerDied","Data":"bc410bebde824f0b999763c5351e90a8a319e8ebc75ad716b3e755b37385ce61"} Mar 18 16:44:59.117609 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.117501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerStarted","Data":"54219cfd0b58accc87e0b9a4a22dfc2e665250bc3426cb1437ec4456b220c2d2"} Mar 18 16:44:59.118656 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.118630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" event={"ID":"f41d5653-51e9-4c4c-9238-8623342b9fb5","Type":"ContainerStarted","Data":"f4b9a179b68e671112b3178ec4d88c70e89204fde869c0ac0cdb9138fb2fdde1"} Mar 18 16:44:59.119849 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.119827 2578 generic.go:358] "Generic (PLEG): container finished" podID="c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf" containerID="d7e0c2482c96f51427ce6b04dd6c086a1dc5eb04ac441d23b21538f63fa17220" exitCode=0 Mar 18 16:44:59.119921 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:44:59.119885 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d78r7" event={"ID":"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf","Type":"ContainerDied","Data":"d7e0c2482c96f51427ce6b04dd6c086a1dc5eb04ac441d23b21538f63fa17220"}
Mar 18 16:44:59.121515 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.121078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-64cg4" event={"ID":"cc27bc09-bb46-4e2c-878a-fbd2388a8177","Type":"ContainerStarted","Data":"3789cb78c0ffac490ac2c6a5cdeca2ec556203d8f1269fd4e11d665b88837fb1"}
Mar 18 16:44:59.122335 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.122313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" event={"ID":"a1398da0-e9bd-4d1e-a150-d44bea8e2d78","Type":"ContainerStarted","Data":"71399629caa940dfecbe7988486432060abf81c26a5cbbf00eca8b46e4d7aa0e"}
Mar 18 16:44:59.123600 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.123580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ttdbk" event={"ID":"245b5eb7-bf78-4c89-8b17-d9e75f2f63d8","Type":"ContainerStarted","Data":"9088034a851f408cd5d1346acacd563a04bfb3e9d06ad788b885e92d5dd6039f"}
Mar 18 16:44:59.133716 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.133681 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-tvxnf" podStartSLOduration=3.567350624 podStartE2EDuration="21.13365058s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:40.552489163 +0000 UTC m=+3.137263172" lastFinishedPulling="2026-03-18 16:44:58.118789118 +0000 UTC m=+20.703563128" observedRunningTime="2026-03-18 16:44:59.133301683 +0000 UTC m=+21.718075709" watchObservedRunningTime="2026-03-18 16:44:59.13365058 +0000 UTC m=+21.718424605"
Mar 18 16:44:59.158017 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.157969 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-64cg4" podStartSLOduration=3.4037603819999998 podStartE2EDuration="21.157954981s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:40.542348823 +0000 UTC m=+3.127122827" lastFinishedPulling="2026-03-18 16:44:58.296543422 +0000 UTC m=+20.881317426" observedRunningTime="2026-03-18 16:44:59.157527125 +0000 UTC m=+21.742301150" watchObservedRunningTime="2026-03-18 16:44:59.157954981 +0000 UTC m=+21.742729009"
Mar 18 16:44:59.192496 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.192457 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ttdbk" podStartSLOduration=11.984299965 podStartE2EDuration="21.192443449s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:40.547340543 +0000 UTC m=+3.132114547" lastFinishedPulling="2026-03-18 16:44:49.755484026 +0000 UTC m=+12.340258031" observedRunningTime="2026-03-18 16:44:59.192400368 +0000 UTC m=+21.777174394" watchObservedRunningTime="2026-03-18 16:44:59.192443449 +0000 UTC m=+21.777217457"
Mar 18 16:44:59.209713 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.209662 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-j8fzd" podStartSLOduration=3.618116735 podStartE2EDuration="21.209647165s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:40.55398598 +0000 UTC m=+3.138759999" lastFinishedPulling="2026-03-18 16:44:58.145516411 +0000 UTC m=+20.730290429" observedRunningTime="2026-03-18 16:44:59.209598783 +0000 UTC m=+21.794372808" watchObservedRunningTime="2026-03-18 16:44:59.209647165 +0000 UTC m=+21.794421402"
Mar 18 16:44:59.486250 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.486221 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Mar 18 16:44:59.924495 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.924383 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:44:59.486246556Z","UUID":"89731bd6-5d42-4df3-8bef-b09047e0bce1","Handler":null,"Name":"","Endpoint":""}
Mar 18 16:44:59.927052 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.927028 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Mar 18 16:44:59.927052 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.927057 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Mar 18 16:44:59.993292 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.993266 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:44:59.993450 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:59.993423 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc"
Mar 18 16:44:59.993531 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:44:59.993480 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:44:59.993623 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:44:59.993598 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880"
Mar 18 16:45:00.127680 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:00.127647 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-shkth" event={"ID":"70516603-846e-4c1f-80db-9b409ecc2c96","Type":"ContainerStarted","Data":"d07854632a99859aa215cb4462af51df749cdd6f456dbff830ea573904f04960"}
Mar 18 16:45:00.130717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:00.130686 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" event={"ID":"f41d5653-51e9-4c4c-9238-8623342b9fb5","Type":"ContainerStarted","Data":"1c1dd348ae613d83211454e7b64d5171d73acb39a0d1e92e67fa7a67fa2b9b0d"}
Mar 18 16:45:00.143151 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:00.143076 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x499v" podStartSLOduration=4.578047096 podStartE2EDuration="22.143059176s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:40.553794152 +0000 UTC m=+3.138568162" lastFinishedPulling="2026-03-18 16:44:58.118806223 +0000 UTC m=+20.703580242" observedRunningTime="2026-03-18 16:44:59.230329919 +0000 UTC m=+21.815103946" watchObservedRunningTime="2026-03-18 16:45:00.143059176 +0000 UTC m=+22.727833204"
Mar 18 16:45:01.135974 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:01.135808 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/0.log"
Mar 18 16:45:01.136533 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:01.136506 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerStarted","Data":"877fd3ecc10a9efc9e0dfd08e9452bd7646056db805e484195d8f90b828a2514"}
Mar 18 16:45:01.138839 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:01.138811 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" event={"ID":"f41d5653-51e9-4c4c-9238-8623342b9fb5","Type":"ContainerStarted","Data":"57013dad4a8eccd85afe9aa669b0cae6b4df32e581ce904c82cabc5eb70cfe80"}
Mar 18 16:45:01.167975 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:01.167914 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-shkth" podStartSLOduration=5.577631811 podStartE2EDuration="23.167895264s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:40.552439266 +0000 UTC m=+3.137213273" lastFinishedPulling="2026-03-18 16:44:58.142702721 +0000 UTC m=+20.727476726" observedRunningTime="2026-03-18 16:45:00.143880362 +0000 UTC m=+22.728654387" watchObservedRunningTime="2026-03-18 16:45:01.167895264 +0000 UTC m=+23.752669296"
Mar 18 16:45:01.993574 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:01.993509 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:45:01.993574 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:01.993551 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:01.993846 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:01.993652 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880"
Mar 18 16:45:01.993846 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:01.993783 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc"
Mar 18 16:45:03.128875 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:03.128850 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ttdbk"
Mar 18 16:45:03.129480 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:03.129464 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ttdbk"
Mar 18 16:45:03.145733 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:03.145589 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/0.log"
Mar 18 16:45:03.145733 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:03.145581 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9kkl" podStartSLOduration=5.113986566 podStartE2EDuration="25.145567281s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:40.549651185 +0000 UTC m=+3.134425194" lastFinishedPulling="2026-03-18 16:45:00.581231901 +0000 UTC m=+23.166005909" observedRunningTime="2026-03-18 16:45:01.167794085 +0000 UTC m=+23.752568112" watchObservedRunningTime="2026-03-18 16:45:03.145567281 +0000 UTC m=+25.730341308"
Mar 18 16:45:03.146269 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:03.145982 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerStarted","Data":"c04338fcfca86f42402c7c894d6ccc6ff21c2a985a3a87af419bce489ad1960d"}
Mar 18 16:45:03.992858 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:03.992628 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:03.993084 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:03.992628 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:45:03.993084 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:03.992960 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc"
Mar 18 16:45:03.993084 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:03.993014 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880"
Mar 18 16:45:04.148396 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:04.148363 2578 generic.go:358] "Generic (PLEG): container finished" podID="c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf" containerID="550574e0e4d31eb41bdab142a3ce140055da6114c6df5a2e1031b3e08c1d28d6" exitCode=0
Mar 18 16:45:04.148848 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:04.148439 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d78r7" event={"ID":"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf","Type":"ContainerDied","Data":"550574e0e4d31eb41bdab142a3ce140055da6114c6df5a2e1031b3e08c1d28d6"}
Mar 18 16:45:04.148848 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:04.148785 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:45:04.148972 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:04.148946 2578 scope.go:117] "RemoveContainer" containerID="bc410bebde824f0b999763c5351e90a8a319e8ebc75ad716b3e755b37385ce61"
Mar 18 16:45:04.164854 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:04.164834 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:45:05.050877 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.050830 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jcvjp"]
Mar 18 16:45:05.050969 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.050951 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:45:05.051050 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:05.051033 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880"
Mar 18 16:45:05.053484 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.053456 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b5628"]
Mar 18 16:45:05.053591 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.053578 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:05.053694 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:05.053676 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc"
Mar 18 16:45:05.153755 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.153737 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/0.log"
Mar 18 16:45:05.154082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.154057 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" event={"ID":"d1ec73fb-e596-4c69-abc5-b3073ed73133","Type":"ContainerStarted","Data":"58eb4fc2e9c219d5d98b605e67bb3c544eb996371cf57c31c4adc59d9f9e1dd9"}
Mar 18 16:45:05.154331 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.154310 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:45:05.154331 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.154333 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:45:05.170371 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.170347 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:45:05.185640 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:05.185590 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v" podStartSLOduration=9.542446398 podStartE2EDuration="27.185569984s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:40.550264943 +0000 UTC m=+3.135038955" lastFinishedPulling="2026-03-18 16:44:58.193388514 +0000 UTC m=+20.778162541" observedRunningTime="2026-03-18 16:45:05.184669367 +0000 UTC m=+27.769443447" watchObservedRunningTime="2026-03-18 16:45:05.185569984 +0000 UTC m=+27.770344012"
Mar 18 16:45:06.157398 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:06.157209 2578 generic.go:358] "Generic (PLEG): container finished" podID="c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf" containerID="c9cb0d92c47167296f95ad1895d165c8e671fcf0cb4f11897e870eac2ba36794" exitCode=0
Mar 18 16:45:06.157841 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:06.157269 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d78r7" event={"ID":"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf","Type":"ContainerDied","Data":"c9cb0d92c47167296f95ad1895d165c8e671fcf0cb4f11897e870eac2ba36794"}
Mar 18 16:45:06.992654 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:06.992622 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:06.992654 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:06.992641 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:45:06.992847 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:06.992724 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc"
Mar 18 16:45:06.992905 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:06.992835 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880"
Mar 18 16:45:07.164298 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:07.164264 2578 generic.go:358] "Generic (PLEG): container finished" podID="c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf" containerID="4d17e3b7f31bef587f0654d26d856bbd304e4190bf7b9d4747d93cabe1485837" exitCode=0
Mar 18 16:45:07.165068 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:07.165042 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d78r7" event={"ID":"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf","Type":"ContainerDied","Data":"4d17e3b7f31bef587f0654d26d856bbd304e4190bf7b9d4747d93cabe1485837"}
Mar 18 16:45:08.167001 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:08.166958 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ttdbk"
Mar 18 16:45:08.167620 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:08.167129 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 18 16:45:08.167620 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:08.167564 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ttdbk"
Mar 18 16:45:08.992881 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:08.992802 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:08.992881 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:08.992825 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:45:08.993075 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:08.992918 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b5628" podUID="4e172408-4d26-4b03-a0eb-bfcb801cdadc"
Mar 18 16:45:08.993156 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:08.993077 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880"
Mar 18 16:45:10.228831 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.228802 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-255.ec2.internal" event="NodeReady"
Mar 18 16:45:10.229407 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.228926 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 18 16:45:10.277565 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.277539 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pb5h9"]
Mar 18 16:45:10.304327 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.304301 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6f4h4"]
Mar 18 16:45:10.304509 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.304489 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.307066 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.307040 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Mar 18 16:45:10.307214 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.307080 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnp77\""
Mar 18 16:45:10.307356 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.307338 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Mar 18 16:45:10.331701 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.331677 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pb5h9"]
Mar 18 16:45:10.331701 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.331714 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6f4h4"]
Mar 18 16:45:10.331936 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.331817 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:10.333683 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.333662 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Mar 18 16:45:10.333948 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.333919 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Mar 18 16:45:10.334309 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.334290 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dz9s5\""
Mar 18 16:45:10.334560 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.334540 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Mar 18 16:45:10.402238 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.402216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:10.402368 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.402254 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhph9\" (UniqueName: \"kubernetes.io/projected/0972e2c0-041f-46c3-8440-60ac3028c22d-kube-api-access-dhph9\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:10.402368 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.402285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0690d7d0-de95-4cec-9e24-53b54d9b232d-tmp-dir\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.402368 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.402336 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjkxq\" (UniqueName: \"kubernetes.io/projected/0690d7d0-de95-4cec-9e24-53b54d9b232d-kube-api-access-xjkxq\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.402472 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.402437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0690d7d0-de95-4cec-9e24-53b54d9b232d-config-volume\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.402472 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.402463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.503622 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.503548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0690d7d0-de95-4cec-9e24-53b54d9b232d-config-volume\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.503622 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.503587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.503826 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:10.503708 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:10.503826 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.503733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:10.503826 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:10.503782 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls podName:0690d7d0-de95-4cec-9e24-53b54d9b232d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:11.003761524 +0000 UTC m=+33.588535532 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls") pod "dns-default-pb5h9" (UID: "0690d7d0-de95-4cec-9e24-53b54d9b232d") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:10.503826 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.503803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhph9\" (UniqueName: \"kubernetes.io/projected/0972e2c0-041f-46c3-8440-60ac3028c22d-kube-api-access-dhph9\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:10.504037 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:10.503828 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:10.504037 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.503840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0690d7d0-de95-4cec-9e24-53b54d9b232d-tmp-dir\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.504037 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:10.503872 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert podName:0972e2c0-041f-46c3-8440-60ac3028c22d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:11.003860203 +0000 UTC m=+33.588634207 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert") pod "ingress-canary-6f4h4" (UID: "0972e2c0-041f-46c3-8440-60ac3028c22d") : secret "canary-serving-cert" not found
Mar 18 16:45:10.504037 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.503891 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkxq\" (UniqueName: \"kubernetes.io/projected/0690d7d0-de95-4cec-9e24-53b54d9b232d-kube-api-access-xjkxq\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.504261 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.504130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0690d7d0-de95-4cec-9e24-53b54d9b232d-tmp-dir\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.504261 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.504184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0690d7d0-de95-4cec-9e24-53b54d9b232d-config-volume\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.516386 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.516363 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjkxq\" (UniqueName: \"kubernetes.io/projected/0690d7d0-de95-4cec-9e24-53b54d9b232d-kube-api-access-xjkxq\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:10.516495 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.516478 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhph9\" (UniqueName: \"kubernetes.io/projected/0972e2c0-041f-46c3-8440-60ac3028c22d-kube-api-access-dhph9\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:10.993284 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.993247 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:10.993284 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.993280 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:45:10.995757 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.995736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 18 16:45:10.996269 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.996250 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w8h72\""
Mar 18 16:45:10.996269 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.996263 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 18 16:45:10.996443 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.996278 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 18 16:45:10.996443 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:10.996254 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-t48rq\""
Mar 18 16:45:11.006667 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:11.006647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:11.006770 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:11.006681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:11.006846 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:11.006780 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:11.006846 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:11.006834 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert podName:0972e2c0-041f-46c3-8440-60ac3028c22d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:12.006816667 +0000 UTC m=+34.591590676 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert") pod "ingress-canary-6f4h4" (UID: "0972e2c0-041f-46c3-8440-60ac3028c22d") : secret "canary-serving-cert" not found
Mar 18 16:45:11.006846 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:11.006837 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:11.007005 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:11.006909 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls podName:0690d7d0-de95-4cec-9e24-53b54d9b232d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:12.006894481 +0000 UTC m=+34.591668503 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls") pod "dns-default-pb5h9" (UID: "0690d7d0-de95-4cec-9e24-53b54d9b232d") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:11.611021 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:11.610979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:45:11.611665 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:11.611198 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:45:11.611665 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:11.611274 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs podName:474e4d0f-dbfc-41a4-ad8f-fcada6a1b880 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:43.611254008 +0000 UTC m=+66.196028057 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs") pod "network-metrics-daemon-jcvjp" (UID: "474e4d0f-dbfc-41a4-ad8f-fcada6a1b880") : secret "metrics-daemon-secret" not found
Mar 18 16:45:11.711739 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:11.711701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6czkt\" (UniqueName: \"kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt\") pod \"network-check-target-b5628\" (UID: \"4e172408-4d26-4b03-a0eb-bfcb801cdadc\") " pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:11.714637 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:11.714608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6czkt\" (UniqueName: \"kubernetes.io/projected/4e172408-4d26-4b03-a0eb-bfcb801cdadc-kube-api-access-6czkt\") pod \"network-check-target-b5628\" (UID: \"4e172408-4d26-4b03-a0eb-bfcb801cdadc\") " pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:11.904834 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:11.904806 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:12.014324 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:12.014293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:12.014496 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:12.014379 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:12.014496 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:12.014452 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:12.014496 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:12.014452 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:12.014629 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:12.014515 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls podName:0690d7d0-de95-4cec-9e24-53b54d9b232d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:14.014497205 +0000 UTC m=+36.599271209 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls") pod "dns-default-pb5h9" (UID: "0690d7d0-de95-4cec-9e24-53b54d9b232d") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:12.014629 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:12.014534 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert podName:0972e2c0-041f-46c3-8440-60ac3028c22d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:14.01452553 +0000 UTC m=+36.599299543 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert") pod "ingress-canary-6f4h4" (UID: "0972e2c0-041f-46c3-8440-60ac3028c22d") : secret "canary-serving-cert" not found
Mar 18 16:45:12.789630 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:12.789466 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b5628"]
Mar 18 16:45:12.792553 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:45:12.792522 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e172408_4d26_4b03_a0eb_bfcb801cdadc.slice/crio-83e6868c77147ae53b0b81bccbbed47163fa999bb3234366d78f472b30c0ca8d WatchSource:0}: Error finding container 83e6868c77147ae53b0b81bccbbed47163fa999bb3234366d78f472b30c0ca8d: Status 404 returned error can't find the container with id 83e6868c77147ae53b0b81bccbbed47163fa999bb3234366d78f472b30c0ca8d
Mar 18 16:45:13.178528 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:13.178499 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d78r7" event={"ID":"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf","Type":"ContainerStarted","Data":"6796d5877c2e30d5e21d06141be1a50dcb1d4f5360d2f4660368039f067a591d"}
Mar 18 16:45:13.179535 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:13.179510 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b5628" event={"ID":"4e172408-4d26-4b03-a0eb-bfcb801cdadc","Type":"ContainerStarted","Data":"83e6868c77147ae53b0b81bccbbed47163fa999bb3234366d78f472b30c0ca8d"}
Mar 18 16:45:14.027036 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:14.027002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:14.027541 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:14.027050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:14.027541 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:14.027168 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:14.027541 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:14.027176 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:14.027541 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:14.027224 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert podName:0972e2c0-041f-46c3-8440-60ac3028c22d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:18.027206137 +0000 UTC m=+40.611980140 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert") pod "ingress-canary-6f4h4" (UID: "0972e2c0-041f-46c3-8440-60ac3028c22d") : secret "canary-serving-cert" not found
Mar 18 16:45:14.027541 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:14.027237 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls podName:0690d7d0-de95-4cec-9e24-53b54d9b232d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:18.027231273 +0000 UTC m=+40.612005277 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls") pod "dns-default-pb5h9" (UID: "0690d7d0-de95-4cec-9e24-53b54d9b232d") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:14.184990 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:14.184956 2578 generic.go:358] "Generic (PLEG): container finished" podID="c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf" containerID="6796d5877c2e30d5e21d06141be1a50dcb1d4f5360d2f4660368039f067a591d" exitCode=0
Mar 18 16:45:14.185149 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:14.185011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d78r7" event={"ID":"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf","Type":"ContainerDied","Data":"6796d5877c2e30d5e21d06141be1a50dcb1d4f5360d2f4660368039f067a591d"}
Mar 18 16:45:15.191572 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:15.191534 2578 generic.go:358] "Generic (PLEG): container finished" podID="c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf" containerID="ee4090ad1116dbd7875ba19a6866e59a0972b3e168f7c6a104cb233fe2716a7d" exitCode=0
Mar 18 16:45:15.191981 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:15.191593 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d78r7" 
event={"ID":"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf","Type":"ContainerDied","Data":"ee4090ad1116dbd7875ba19a6866e59a0972b3e168f7c6a104cb233fe2716a7d"}
Mar 18 16:45:16.194972 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:16.194934 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b5628" event={"ID":"4e172408-4d26-4b03-a0eb-bfcb801cdadc","Type":"ContainerStarted","Data":"43870098eca619b1c3dcad2dcb41b0eda62addee986417f298774c67ea673141"}
Mar 18 16:45:16.195422 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:16.195007 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:45:16.197625 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:16.197603 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d78r7" event={"ID":"c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf","Type":"ContainerStarted","Data":"921d008c3bd6fa142067c5f5a9ef939545468d244982398519913d6d64d6fc03"}
Mar 18 16:45:16.215874 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:16.215835 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b5628" podStartSLOduration=35.001231882 podStartE2EDuration="38.215822764s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:45:12.794380566 +0000 UTC m=+35.379154570" lastFinishedPulling="2026-03-18 16:45:16.008971447 +0000 UTC m=+38.593745452" observedRunningTime="2026-03-18 16:45:16.215336106 +0000 UTC m=+38.800110133" watchObservedRunningTime="2026-03-18 16:45:16.215822764 +0000 UTC m=+38.800596980"
Mar 18 16:45:16.238060 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:16.238009 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d78r7" podStartSLOduration=5.770314898 podStartE2EDuration="38.237992796s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:44:40.545690867 +0000 UTC m=+3.130464875" lastFinishedPulling="2026-03-18 16:45:13.013368769 +0000 UTC m=+35.598142773" observedRunningTime="2026-03-18 16:45:16.23711728 +0000 UTC m=+38.821891302" watchObservedRunningTime="2026-03-18 16:45:16.237992796 +0000 UTC m=+38.822766821"
Mar 18 16:45:18.056273 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:18.056239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:18.056273 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:18.056280 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:18.056773 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:18.056369 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:18.056773 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:18.056371 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:18.056773 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:18.056415 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert podName:0972e2c0-041f-46c3-8440-60ac3028c22d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:26.056402438 +0000 UTC m=+48.641176442 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert") pod "ingress-canary-6f4h4" (UID: "0972e2c0-041f-46c3-8440-60ac3028c22d") : secret "canary-serving-cert" not found
Mar 18 16:45:18.056773 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:18.056428 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls podName:0690d7d0-de95-4cec-9e24-53b54d9b232d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:26.056422099 +0000 UTC m=+48.641196103 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls") pod "dns-default-pb5h9" (UID: "0690d7d0-de95-4cec-9e24-53b54d9b232d") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:26.110662 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:26.110569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:26.111153 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:26.110695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:26.111153 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:26.110642 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:26.111153 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:26.110774 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls podName:0690d7d0-de95-4cec-9e24-53b54d9b232d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:42.110760815 +0000 UTC m=+64.695534820 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls") pod "dns-default-pb5h9" (UID: "0690d7d0-de95-4cec-9e24-53b54d9b232d") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:26.111153 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:26.110790 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:26.111153 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:26.110830 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert podName:0972e2c0-041f-46c3-8440-60ac3028c22d nodeName:}" failed. No retries permitted until 2026-03-18 16:45:42.110819879 +0000 UTC m=+64.695593883 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert") pod "ingress-canary-6f4h4" (UID: "0972e2c0-041f-46c3-8440-60ac3028c22d") : secret "canary-serving-cert" not found
Mar 18 16:45:37.176308 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:37.176279 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hjd4v"
Mar 18 16:45:42.120783 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:42.120744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:45:42.120783 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:42.120789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:45:42.121264 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:42.120893 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:45:42.121264 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:42.120972 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls podName:0690d7d0-de95-4cec-9e24-53b54d9b232d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:14.120956298 +0000 UTC m=+96.705730302 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls") pod "dns-default-pb5h9" (UID: "0690d7d0-de95-4cec-9e24-53b54d9b232d") : secret "dns-default-metrics-tls" not found
Mar 18 16:45:42.121264 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:42.120896 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:45:42.121264 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:42.121041 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert podName:0972e2c0-041f-46c3-8440-60ac3028c22d nodeName:}" failed. No retries permitted until 2026-03-18 16:46:14.121028925 +0000 UTC m=+96.705802928 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert") pod "ingress-canary-6f4h4" (UID: "0972e2c0-041f-46c3-8440-60ac3028c22d") : secret "canary-serving-cert" not found
Mar 18 16:45:43.629267 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:43.629230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:45:43.629711 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:43.629401 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:45:43.629711 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:45:43.629482 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs podName:474e4d0f-dbfc-41a4-ad8f-fcada6a1b880 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:46:47.629460981 +0000 UTC m=+130.214234986 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs") pod "network-metrics-daemon-jcvjp" (UID: "474e4d0f-dbfc-41a4-ad8f-fcada6a1b880") : secret "metrics-daemon-secret" not found
Mar 18 16:45:47.202051 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:45:47.202020 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b5628"
Mar 18 16:46:14.128848 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:14.128701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9"
Mar 18 16:46:14.128848 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:14.128746 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:46:14.128848 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:14.128827 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:46:14.129421 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:14.128861 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:46:14.129421 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:14.128886 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert podName:0972e2c0-041f-46c3-8440-60ac3028c22d nodeName:}" failed. No retries permitted until 2026-03-18 16:47:18.128868381 +0000 UTC m=+160.713642385 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert") pod "ingress-canary-6f4h4" (UID: "0972e2c0-041f-46c3-8440-60ac3028c22d") : secret "canary-serving-cert" not found
Mar 18 16:46:14.129421 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:14.128924 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls podName:0690d7d0-de95-4cec-9e24-53b54d9b232d nodeName:}" failed. No retries permitted until 2026-03-18 16:47:18.128907258 +0000 UTC m=+160.713681262 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls") pod "dns-default-pb5h9" (UID: "0690d7d0-de95-4cec-9e24-53b54d9b232d") : secret "dns-default-metrics-tls" not found
Mar 18 16:46:47.661606 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:47.661569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp"
Mar 18 16:46:47.662050 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:47.661686 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:46:47.662050 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:47.661746 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs podName:474e4d0f-dbfc-41a4-ad8f-fcada6a1b880 nodeName:}" failed. No retries permitted until 2026-03-18 16:48:49.661732094 +0000 UTC m=+252.246506103 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs") pod "network-metrics-daemon-jcvjp" (UID: "474e4d0f-dbfc-41a4-ad8f-fcada6a1b880") : secret "metrics-daemon-secret" not found
Mar 18 16:46:49.253513 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.253476 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"]
Mar 18 16:46:49.256273 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.256258 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"
Mar 18 16:46:49.258352 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.258331 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jslwt\""
Mar 18 16:46:49.258597 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.258584 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:46:49.258916 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.258901 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Mar 18 16:46:49.259626 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.259611 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Mar 18 16:46:49.268930 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.268908 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"]
Mar 18 16:46:49.350537 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.350513 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n"]
Mar 18 16:46:49.353232 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.353214 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n"
Mar 18 16:46:49.354114 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.354071 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd"]
Mar 18 16:46:49.355211 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.355192 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Mar 18 16:46:49.355322 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.355232 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Mar 18 16:46:49.355322 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.355287 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-bp5px\""
Mar 18 16:46:49.356465 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.356449 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-bv4qv"]
Mar 18 16:46:49.356583 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.356567 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd"
Mar 18 16:46:49.358943 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.358919 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-km89d\""
Mar 18 16:46:49.359033 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.358963 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-bv4qv"
Mar 18 16:46:49.359033 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.359020 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Mar 18 16:46:49.359440 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.359425 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Mar 18 16:46:49.360358 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.360247 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Mar 18 16:46:49.360358 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.360250 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Mar 18 16:46:49.361612 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.361596 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Mar 18 16:46:49.361612 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.361598 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Mar 18 16:46:49.362458 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.362441 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Mar 18 16:46:49.362530 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.362475 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Mar 18 16:46:49.362880 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.362865 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-vf69f\""
Mar 18 16:46:49.368414 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.368398 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n"]
Mar 18 16:46:49.371166 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.371143 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd"]
Mar 18 16:46:49.371257 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.371238 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Mar 18 16:46:49.371860 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.371843 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-bv4qv"]
Mar 18 16:46:49.373086 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.373063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"
Mar 18 16:46:49.373229 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.373212 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qrgj7\" (UniqueName: \"kubernetes.io/projected/ef6b72f1-87c3-4a21-9882-6564d4d4d617-kube-api-access-qrgj7\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk" Mar 18 16:46:49.452834 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.452802 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-d48cdb47c-jw9jp"] Mar 18 16:46:49.455502 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.455489 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.457280 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.457257 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Mar 18 16:46:49.457532 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.457519 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 18 16:46:49.457795 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.457780 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 18 16:46:49.457858 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.457840 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8k48n\"" Mar 18 16:46:49.462614 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.462592 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 18 16:46:49.470011 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.469992 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-d48cdb47c-jw9jp"] Mar 18 16:46:49.474269 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474250 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/143d5104-122a-4bd9-ac1e-35fce758029a-tmp\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.474362 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474282 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/73e43277-038f-4657-95b6-addae5fb597c-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:46:49.474362 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/143d5104-122a-4bd9-ac1e-35fce758029a-snapshots\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.474362 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474340 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrgj7\" (UniqueName: \"kubernetes.io/projected/ef6b72f1-87c3-4a21-9882-6564d4d4d617-kube-api-access-qrgj7\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk" Mar 18 16:46:49.474539 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474381 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqtjs\" (UniqueName: \"kubernetes.io/projected/5ce47c73-a640-42a6-87a2-d2c7e5f304e2-kube-api-access-xqtjs\") pod \"volume-data-source-validator-67fdcb5769-wrk6n\" (UID: \"5ce47c73-a640-42a6-87a2-d2c7e5f304e2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n" Mar 18 16:46:49.474539 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474496 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/143d5104-122a-4bd9-ac1e-35fce758029a-service-ca-bundle\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.474645 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474536 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvgdp\" (UniqueName: \"kubernetes.io/projected/143d5104-122a-4bd9-ac1e-35fce758029a-kube-api-access-jvgdp\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.474645 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474605 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk" Mar 18 16:46:49.474752 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474634 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/143d5104-122a-4bd9-ac1e-35fce758029a-serving-cert\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.474752 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:49.474703 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 18 16:46:49.474752 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:46:49.474906 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:49.474764 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls podName:ef6b72f1-87c3-4a21-9882-6564d4d4d617 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:49.974742349 +0000 UTC m=+132.559516354 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-jpkdk" (UID: "ef6b72f1-87c3-4a21-9882-6564d4d4d617") : secret "samples-operator-tls" not found Mar 18 16:46:49.474906 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vvht\" (UniqueName: \"kubernetes.io/projected/73e43277-038f-4657-95b6-addae5fb597c-kube-api-access-4vvht\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:46:49.474906 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.474816 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/143d5104-122a-4bd9-ac1e-35fce758029a-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.487061 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.487033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrgj7\" (UniqueName: \"kubernetes.io/projected/ef6b72f1-87c3-4a21-9882-6564d4d4d617-kube-api-access-qrgj7\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk" Mar 18 16:46:49.575918 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.575854 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqtjs\" (UniqueName: \"kubernetes.io/projected/5ce47c73-a640-42a6-87a2-d2c7e5f304e2-kube-api-access-xqtjs\") pod 
\"volume-data-source-validator-67fdcb5769-wrk6n\" (UID: \"5ce47c73-a640-42a6-87a2-d2c7e5f304e2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n" Mar 18 16:46:49.575918 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.575886 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/143d5104-122a-4bd9-ac1e-35fce758029a-service-ca-bundle\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.575918 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.575906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-ca-trust-extracted\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.576151 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.575921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvgdp\" (UniqueName: \"kubernetes.io/projected/143d5104-122a-4bd9-ac1e-35fce758029a-kube-api-access-jvgdp\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.576151 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.575937 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.576151 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:46:49.576082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143d5104-122a-4bd9-ac1e-35fce758029a-serving-cert\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.576306 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576150 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-trusted-ca\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.576306 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:46:49.576306 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vvht\" (UniqueName: \"kubernetes.io/projected/73e43277-038f-4657-95b6-addae5fb597c-kube-api-access-4vvht\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:46:49.576306 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576248 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/143d5104-122a-4bd9-ac1e-35fce758029a-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.576306 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576269 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-installation-pull-secrets\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.576306 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:49.576289 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:49.576306 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576307 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-bound-sa-token\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.576615 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/143d5104-122a-4bd9-ac1e-35fce758029a-tmp\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.576615 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:49.576345 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls podName:73e43277-038f-4657-95b6-addae5fb597c nodeName:}" failed. No retries permitted until 2026-03-18 16:46:50.076327909 +0000 UTC m=+132.661101914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-jh7zd" (UID: "73e43277-038f-4657-95b6-addae5fb597c") : secret "cluster-monitoring-operator-tls" not found Mar 18 16:46:49.576615 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576411 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-certificates\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.576615 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/73e43277-038f-4657-95b6-addae5fb597c-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:46:49.576615 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/143d5104-122a-4bd9-ac1e-35fce758029a-snapshots\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.576615 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:46:49.576499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-image-registry-private-configuration\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.576615 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576527 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/143d5104-122a-4bd9-ac1e-35fce758029a-service-ca-bundle\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.576615 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.576532 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfhfr\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-kube-api-access-rfhfr\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.577170 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.577148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/73e43277-038f-4657-95b6-addae5fb597c-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:46:49.577244 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.577227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/143d5104-122a-4bd9-ac1e-35fce758029a-tmp\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.577298 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.577274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/143d5104-122a-4bd9-ac1e-35fce758029a-snapshots\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.577489 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.577471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/143d5104-122a-4bd9-ac1e-35fce758029a-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.578893 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.578877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143d5104-122a-4bd9-ac1e-35fce758029a-serving-cert\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.585824 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.585803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqtjs\" (UniqueName: \"kubernetes.io/projected/5ce47c73-a640-42a6-87a2-d2c7e5f304e2-kube-api-access-xqtjs\") pod \"volume-data-source-validator-67fdcb5769-wrk6n\" (UID: \"5ce47c73-a640-42a6-87a2-d2c7e5f304e2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n" Mar 18 16:46:49.585928 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:46:49.585912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vvht\" (UniqueName: \"kubernetes.io/projected/73e43277-038f-4657-95b6-addae5fb597c-kube-api-access-4vvht\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:46:49.586075 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.586055 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvgdp\" (UniqueName: \"kubernetes.io/projected/143d5104-122a-4bd9-ac1e-35fce758029a-kube-api-access-jvgdp\") pod \"insights-operator-76bdd9f478-bv4qv\" (UID: \"143d5104-122a-4bd9-ac1e-35fce758029a\") " pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.663666 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.663650 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n" Mar 18 16:46:49.676392 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.676370 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" Mar 18 16:46:49.677002 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.676874 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-bound-sa-token\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.677002 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.676901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-certificates\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.677002 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.676923 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-image-registry-private-configuration\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.677002 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.676948 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfhfr\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-kube-api-access-rfhfr\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.677002 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.676993 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-ca-trust-extracted\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.677293 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.677018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.677293 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.677065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-trusted-ca\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.677293 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.677136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-installation-pull-secrets\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:46:49.677542 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.677520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-certificates\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " 
pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:49.677977 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:49.677955 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:49.677977 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:49.677978 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d48cdb47c-jw9jp: secret "image-registry-tls" not found
Mar 18 16:46:49.678140 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:49.678052 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls podName:80a5817b-7fb9-49f5-b276-9dc6d1d1f924 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:50.178031489 +0000 UTC m=+132.762805493 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls") pod "image-registry-d48cdb47c-jw9jp" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924") : secret "image-registry-tls" not found
Mar 18 16:46:49.678504 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.678482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-trusted-ca\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:49.678581 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.678535 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-ca-trust-extracted\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:49.679462 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.679440 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-image-registry-private-configuration\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:49.679602 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.679587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-installation-pull-secrets\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:49.686590 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.686547 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-bound-sa-token\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:49.687044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.687024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfhfr\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-kube-api-access-rfhfr\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:49.785627 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.785597 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n"]
Mar 18 16:46:49.788961 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:46:49.788935 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ce47c73_a640_42a6_87a2_d2c7e5f304e2.slice/crio-3c3563988756ab9df2c2f839f9bfcee73fd8c344a3da43014054ef9ccc771327 WatchSource:0}: Error finding container 3c3563988756ab9df2c2f839f9bfcee73fd8c344a3da43014054ef9ccc771327: Status 404 returned error can't find the container with id 3c3563988756ab9df2c2f839f9bfcee73fd8c344a3da43014054ef9ccc771327
Mar 18 16:46:49.801535 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.801512 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-bv4qv"]
Mar 18 16:46:49.804343 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:46:49.804315 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod143d5104_122a_4bd9_ac1e_35fce758029a.slice/crio-e20d919bf7d4f6c79f6e34273191fa3d2a1d6d8309a809945a28b4f9276b3b2c WatchSource:0}: Error finding container e20d919bf7d4f6c79f6e34273191fa3d2a1d6d8309a809945a28b4f9276b3b2c: Status 404 returned error can't find the container with id e20d919bf7d4f6c79f6e34273191fa3d2a1d6d8309a809945a28b4f9276b3b2c
Mar 18 16:46:49.979961 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:49.979931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"
Mar 18 16:46:49.980139 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:49.980068 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:46:49.980185 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:49.980145 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls podName:ef6b72f1-87c3-4a21-9882-6564d4d4d617 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:50.980130496 +0000 UTC m=+133.564904499 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-jpkdk" (UID: "ef6b72f1-87c3-4a21-9882-6564d4d4d617") : secret "samples-operator-tls" not found
Mar 18 16:46:50.080479 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:50.080444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd"
Mar 18 16:46:50.080664 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:50.080565 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:50.080664 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:50.080616 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls podName:73e43277-038f-4657-95b6-addae5fb597c nodeName:}" failed. No retries permitted until 2026-03-18 16:46:51.080602007 +0000 UTC m=+133.665376010 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-jh7zd" (UID: "73e43277-038f-4657-95b6-addae5fb597c") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:50.181076 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:50.181043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:50.181234 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:50.181195 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:50.181234 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:50.181213 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d48cdb47c-jw9jp: secret "image-registry-tls" not found
Mar 18 16:46:50.181298 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:50.181266 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls podName:80a5817b-7fb9-49f5-b276-9dc6d1d1f924 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:51.181251788 +0000 UTC m=+133.766025797 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls") pod "image-registry-d48cdb47c-jw9jp" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924") : secret "image-registry-tls" not found
Mar 18 16:46:50.369846 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:50.369755 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" event={"ID":"143d5104-122a-4bd9-ac1e-35fce758029a","Type":"ContainerStarted","Data":"e20d919bf7d4f6c79f6e34273191fa3d2a1d6d8309a809945a28b4f9276b3b2c"}
Mar 18 16:46:50.370973 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:50.370946 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n" event={"ID":"5ce47c73-a640-42a6-87a2-d2c7e5f304e2","Type":"ContainerStarted","Data":"3c3563988756ab9df2c2f839f9bfcee73fd8c344a3da43014054ef9ccc771327"}
Mar 18 16:46:50.987434 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:50.987402 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"
Mar 18 16:46:50.987587 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:50.987569 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:46:50.987650 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:50.987635 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls podName:ef6b72f1-87c3-4a21-9882-6564d4d4d617 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:52.987619411 +0000 UTC m=+135.572393415 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-jpkdk" (UID: "ef6b72f1-87c3-4a21-9882-6564d4d4d617") : secret "samples-operator-tls" not found
Mar 18 16:46:51.088628 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:51.088591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd"
Mar 18 16:46:51.088806 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:51.088728 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:51.088806 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:51.088801 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls podName:73e43277-038f-4657-95b6-addae5fb597c nodeName:}" failed. No retries permitted until 2026-03-18 16:46:53.088782269 +0000 UTC m=+135.673556286 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-jh7zd" (UID: "73e43277-038f-4657-95b6-addae5fb597c") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:51.189347 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:51.189315 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:51.189502 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:51.189451 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:51.189502 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:51.189475 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d48cdb47c-jw9jp: secret "image-registry-tls" not found
Mar 18 16:46:51.189584 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:51.189532 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls podName:80a5817b-7fb9-49f5-b276-9dc6d1d1f924 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:53.189516612 +0000 UTC m=+135.774290616 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls") pod "image-registry-d48cdb47c-jw9jp" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924") : secret "image-registry-tls" not found
Mar 18 16:46:52.376166 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:52.376125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" event={"ID":"143d5104-122a-4bd9-ac1e-35fce758029a","Type":"ContainerStarted","Data":"843d98527978f02cf50344d6c870a766293dda341bd800ab8e971c4f7e0ee90c"}
Mar 18 16:46:52.377529 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:52.377503 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n" event={"ID":"5ce47c73-a640-42a6-87a2-d2c7e5f304e2","Type":"ContainerStarted","Data":"98e2928a9c9501811494aea6dcf2772061dc141bb37547a14a50470c73f373ff"}
Mar 18 16:46:52.393261 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:52.393221 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" podStartSLOduration=1.54573931 podStartE2EDuration="3.393208291s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="2026-03-18 16:46:49.8059274 +0000 UTC m=+132.390701404" lastFinishedPulling="2026-03-18 16:46:51.653396368 +0000 UTC m=+134.238170385" observedRunningTime="2026-03-18 16:46:52.392488742 +0000 UTC m=+134.977262767" watchObservedRunningTime="2026-03-18 16:46:52.393208291 +0000 UTC m=+134.977982316"
Mar 18 16:46:53.004472 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:53.004444 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"
Mar 18 16:46:53.004753 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:53.004563 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:46:53.004753 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:53.004613 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls podName:ef6b72f1-87c3-4a21-9882-6564d4d4d617 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:57.004598382 +0000 UTC m=+139.589372385 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-jpkdk" (UID: "ef6b72f1-87c3-4a21-9882-6564d4d4d617") : secret "samples-operator-tls" not found
Mar 18 16:46:53.105241 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:53.105207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd"
Mar 18 16:46:53.105396 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:53.105338 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:53.105451 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:53.105396 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls podName:73e43277-038f-4657-95b6-addae5fb597c nodeName:}" failed. No retries permitted until 2026-03-18 16:46:57.105381358 +0000 UTC m=+139.690155362 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-jh7zd" (UID: "73e43277-038f-4657-95b6-addae5fb597c") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:53.206025 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:53.205992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:53.206177 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:53.206133 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:53.206177 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:53.206145 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d48cdb47c-jw9jp: secret "image-registry-tls" not found
Mar 18 16:46:53.206296 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:53.206206 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls podName:80a5817b-7fb9-49f5-b276-9dc6d1d1f924 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:57.20618684 +0000 UTC m=+139.790960847 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls") pod "image-registry-d48cdb47c-jw9jp" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924") : secret "image-registry-tls" not found
Mar 18 16:46:54.080916 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.080853 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-wrk6n" podStartSLOduration=3.217496797 podStartE2EDuration="5.080833265s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="2026-03-18 16:46:49.790572858 +0000 UTC m=+132.375346862" lastFinishedPulling="2026-03-18 16:46:51.653909325 +0000 UTC m=+134.238683330" observedRunningTime="2026-03-18 16:46:52.406387405 +0000 UTC m=+134.991161449" watchObservedRunningTime="2026-03-18 16:46:54.080833265 +0000 UTC m=+136.665607280"
Mar 18 16:46:54.081866 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.081848 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv"]
Mar 18 16:46:54.084912 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.084898 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv"
Mar 18 16:46:54.086801 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.086776 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Mar 18 16:46:54.086912 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.086798 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-92ml7\""
Mar 18 16:46:54.087148 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.087130 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Mar 18 16:46:54.093531 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.093512 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv"]
Mar 18 16:46:54.212572 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.212541 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8xp2\" (UniqueName: \"kubernetes.io/projected/bd635b95-ab3b-4d16-951c-a3b3f97a8996-kube-api-access-j8xp2\") pod \"migrator-6b589cdcc-4d8qv\" (UID: \"bd635b95-ab3b-4d16-951c-a3b3f97a8996\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv"
Mar 18 16:46:54.312962 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.312922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8xp2\" (UniqueName: \"kubernetes.io/projected/bd635b95-ab3b-4d16-951c-a3b3f97a8996-kube-api-access-j8xp2\") pod \"migrator-6b589cdcc-4d8qv\" (UID: \"bd635b95-ab3b-4d16-951c-a3b3f97a8996\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv"
Mar 18 16:46:54.320633 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.320603 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8xp2\" (UniqueName: \"kubernetes.io/projected/bd635b95-ab3b-4d16-951c-a3b3f97a8996-kube-api-access-j8xp2\") pod \"migrator-6b589cdcc-4d8qv\" (UID: \"bd635b95-ab3b-4d16-951c-a3b3f97a8996\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv"
Mar 18 16:46:54.393368 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.393342 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv"
Mar 18 16:46:54.502413 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:54.502379 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv"]
Mar 18 16:46:54.505608 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:46:54.505578 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd635b95_ab3b_4d16_951c_a3b3f97a8996.slice/crio-69797a49c98cd815383e83b8ab32d8c3ba1ef4da986811c6a32c4f3297100f5f WatchSource:0}: Error finding container 69797a49c98cd815383e83b8ab32d8c3ba1ef4da986811c6a32c4f3297100f5f: Status 404 returned error can't find the container with id 69797a49c98cd815383e83b8ab32d8c3ba1ef4da986811c6a32c4f3297100f5f
Mar 18 16:46:55.383936 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:55.383904 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv" event={"ID":"bd635b95-ab3b-4d16-951c-a3b3f97a8996","Type":"ContainerStarted","Data":"69797a49c98cd815383e83b8ab32d8c3ba1ef4da986811c6a32c4f3297100f5f"}
Mar 18 16:46:55.450347 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:55.450320 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tvxnf_e48c47f8-3be3-4bee-bab5-5a2d007486f8/dns-node-resolver/0.log"
Mar 18 16:46:56.387500 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:56.387466 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv" event={"ID":"bd635b95-ab3b-4d16-951c-a3b3f97a8996","Type":"ContainerStarted","Data":"00185ff781ab61774790c1f91b5b3e25414e5b119454920d4f858750c37c1d58"}
Mar 18 16:46:56.387500 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:56.387501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv" event={"ID":"bd635b95-ab3b-4d16-951c-a3b3f97a8996","Type":"ContainerStarted","Data":"937f367b56a2254a8f30e042939835811094799b055331e3584d130d887840a8"}
Mar 18 16:46:56.403319 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:56.403278 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-4d8qv" podStartSLOduration=1.273281409 podStartE2EDuration="2.403264275s" podCreationTimestamp="2026-03-18 16:46:54 +0000 UTC" firstStartedPulling="2026-03-18 16:46:54.507404985 +0000 UTC m=+137.092178989" lastFinishedPulling="2026-03-18 16:46:55.63738785 +0000 UTC m=+138.222161855" observedRunningTime="2026-03-18 16:46:56.403080257 +0000 UTC m=+138.987854283" watchObservedRunningTime="2026-03-18 16:46:56.403264275 +0000 UTC m=+138.988038299"
Mar 18 16:46:56.459520 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:56.459497 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x499v_bbb42fee-4a86-4fbc-b701-d582b093b57a/node-ca/0.log"
Mar 18 16:46:57.034179 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:57.034145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"
Mar 18 16:46:57.034341 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:57.034313 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 18 16:46:57.034415 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:57.034387 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls podName:ef6b72f1-87c3-4a21-9882-6564d4d4d617 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:05.034367342 +0000 UTC m=+147.619141366 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-jpkdk" (UID: "ef6b72f1-87c3-4a21-9882-6564d4d4d617") : secret "samples-operator-tls" not found
Mar 18 16:46:57.135106 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:57.135049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd"
Mar 18 16:46:57.135238 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:57.135189 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:57.135273 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:57.135247 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls podName:73e43277-038f-4657-95b6-addae5fb597c nodeName:}" failed. No retries permitted until 2026-03-18 16:47:05.13523286 +0000 UTC m=+147.720006868 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-jh7zd" (UID: "73e43277-038f-4657-95b6-addae5fb597c") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:46:57.236084 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:46:57.236053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:46:57.236246 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:57.236199 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:46:57.236246 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:57.236217 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-d48cdb47c-jw9jp: secret "image-registry-tls" not found
Mar 18 16:46:57.236332 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:46:57.236273 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls podName:80a5817b-7fb9-49f5-b276-9dc6d1d1f924 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:05.236258753 +0000 UTC m=+147.821032756 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls") pod "image-registry-d48cdb47c-jw9jp" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924") : secret "image-registry-tls" not found
Mar 18 16:47:05.097468 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.097431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"
Mar 18 16:47:05.099840 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.099820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef6b72f1-87c3-4a21-9882-6564d4d4d617-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-jpkdk\" (UID: \"ef6b72f1-87c3-4a21-9882-6564d4d4d617\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"
Mar 18 16:47:05.164322 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.164290 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"
Mar 18 16:47:05.198319 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.198288 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd"
Mar 18 16:47:05.198456 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:05.198429 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 18 16:47:05.198510 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:05.198501 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls podName:73e43277-038f-4657-95b6-addae5fb597c nodeName:}" failed. No retries permitted until 2026-03-18 16:47:21.198484766 +0000 UTC m=+163.783258769 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-jh7zd" (UID: "73e43277-038f-4657-95b6-addae5fb597c") : secret "cluster-monitoring-operator-tls" not found
Mar 18 16:47:05.277689 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.277655 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk"]
Mar 18 16:47:05.298891 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.298859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:47:05.301667 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.301640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") pod \"image-registry-d48cdb47c-jw9jp\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") " pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:47:05.363424 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.363367 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:47:05.410449 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.410411 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk" event={"ID":"ef6b72f1-87c3-4a21-9882-6564d4d4d617","Type":"ContainerStarted","Data":"526c611e2e956dde505552ae742058bcc62b9eee27607eaa7cacd829027293e5"}
Mar 18 16:47:05.475871 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:05.475838 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-d48cdb47c-jw9jp"]
Mar 18 16:47:05.479404 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:47:05.479378 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80a5817b_7fb9_49f5_b276_9dc6d1d1f924.slice/crio-fd6972f18d00e45c7002e7811c0318699585a76f11d9dd6c3d59a7123bf1f93d WatchSource:0}: Error finding container fd6972f18d00e45c7002e7811c0318699585a76f11d9dd6c3d59a7123bf1f93d: Status 404 returned error can't find the container with id fd6972f18d00e45c7002e7811c0318699585a76f11d9dd6c3d59a7123bf1f93d
Mar 18 16:47:06.414417 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:06.414374 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" event={"ID":"80a5817b-7fb9-49f5-b276-9dc6d1d1f924","Type":"ContainerStarted","Data":"1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776"}
Mar 18 16:47:06.414417 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:06.414421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" event={"ID":"80a5817b-7fb9-49f5-b276-9dc6d1d1f924","Type":"ContainerStarted","Data":"fd6972f18d00e45c7002e7811c0318699585a76f11d9dd6c3d59a7123bf1f93d"}
Mar 18 16:47:06.414883 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:06.414562 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:47:06.433477 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:06.433427 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" podStartSLOduration=17.433411303 podStartE2EDuration="17.433411303s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:47:06.432161156 +0000 UTC m=+149.016935192" watchObservedRunningTime="2026-03-18 16:47:06.433411303 +0000 UTC m=+149.018185331"
Mar 18 16:47:07.418669 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:07.418635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk" event={"ID":"ef6b72f1-87c3-4a21-9882-6564d4d4d617","Type":"ContainerStarted","Data":"540bccd3cffab6cbcfc0e82c3eeef92040dc619958f658235a42e0dd0188e466"}
Mar 18 16:47:07.418669 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:07.418670 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk" event={"ID":"ef6b72f1-87c3-4a21-9882-6564d4d4d617","Type":"ContainerStarted","Data":"64a136123d05a18b3a8a85549854474562a3531a6080c67207d71ad371ba1142"}
Mar 18 16:47:07.435956 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:07.435916 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-jpkdk" podStartSLOduration=17.029554294 podStartE2EDuration="18.435902383s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="2026-03-18 16:47:05.322695394 +0000 UTC m=+147.907469397" lastFinishedPulling="2026-03-18 16:47:06.729043481 +0000 UTC m=+149.313817486" observedRunningTime="2026-03-18 16:47:07.435126483 +0000 UTC m=+150.019900532" watchObservedRunningTime="2026-03-18 16:47:07.435902383 +0000 UTC m=+150.020676408"
Mar 18 16:47:13.315702 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:13.315659 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-pb5h9" podUID="0690d7d0-de95-4cec-9e24-53b54d9b232d"
Mar 18 16:47:13.342010 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:13.341973 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6f4h4" podUID="0972e2c0-041f-46c3-8440-60ac3028c22d"
Mar 18 16:47:13.432346 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:13.432311 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6f4h4"
Mar 18 16:47:13.432516 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:13.432465 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-dns/dns-default-pb5h9" Mar 18 16:47:14.010466 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:14.010432 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jcvjp" podUID="474e4d0f-dbfc-41a4-ad8f-fcada6a1b880" Mar 18 16:47:18.198410 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.198373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4" Mar 18 16:47:18.198785 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.198488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9" Mar 18 16:47:18.200982 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.200958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0690d7d0-de95-4cec-9e24-53b54d9b232d-metrics-tls\") pod \"dns-default-pb5h9\" (UID: \"0690d7d0-de95-4cec-9e24-53b54d9b232d\") " pod="openshift-dns/dns-default-pb5h9" Mar 18 16:47:18.201156 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.201135 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0972e2c0-041f-46c3-8440-60ac3028c22d-cert\") pod \"ingress-canary-6f4h4\" (UID: \"0972e2c0-041f-46c3-8440-60ac3028c22d\") " pod="openshift-ingress-canary/ingress-canary-6f4h4" Mar 18 16:47:18.235108 ip-10-0-130-255 kubenswrapper[2578]: I0318 
16:47:18.235076 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dz9s5\"" Mar 18 16:47:18.235326 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.235309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wnp77\"" Mar 18 16:47:18.243668 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.243649 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6f4h4" Mar 18 16:47:18.243668 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.243667 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pb5h9" Mar 18 16:47:18.375791 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.375768 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6f4h4"] Mar 18 16:47:18.378052 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:47:18.378016 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0972e2c0_041f_46c3_8440_60ac3028c22d.slice/crio-0089eb56b457b07699967067c68f529090f2bc7744d5ba20107bc38664e3e198 WatchSource:0}: Error finding container 0089eb56b457b07699967067c68f529090f2bc7744d5ba20107bc38664e3e198: Status 404 returned error can't find the container with id 0089eb56b457b07699967067c68f529090f2bc7744d5ba20107bc38664e3e198 Mar 18 16:47:18.396267 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.396241 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pb5h9"] Mar 18 16:47:18.400365 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:47:18.400338 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0690d7d0_de95_4cec_9e24_53b54d9b232d.slice/crio-880377aee580e135dbbef27d0bb3a8949ca5c0cc86bb25456c26349d96e7cd71 WatchSource:0}: Error finding container 880377aee580e135dbbef27d0bb3a8949ca5c0cc86bb25456c26349d96e7cd71: Status 404 returned error can't find the container with id 880377aee580e135dbbef27d0bb3a8949ca5c0cc86bb25456c26349d96e7cd71 Mar 18 16:47:18.442979 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.442947 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6f4h4" event={"ID":"0972e2c0-041f-46c3-8440-60ac3028c22d","Type":"ContainerStarted","Data":"0089eb56b457b07699967067c68f529090f2bc7744d5ba20107bc38664e3e198"} Mar 18 16:47:18.443786 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:18.443769 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pb5h9" event={"ID":"0690d7d0-de95-4cec-9e24-53b54d9b232d","Type":"ContainerStarted","Data":"880377aee580e135dbbef27d0bb3a8949ca5c0cc86bb25456c26349d96e7cd71"} Mar 18 16:47:19.146549 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.146506 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4bt94"] Mar 18 16:47:19.150067 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.149869 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.172277 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.172250 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Mar 18 16:47:19.172277 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.172272 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fhwjh\"" Mar 18 16:47:19.172486 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.172364 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Mar 18 16:47:19.202241 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.202194 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4bt94"] Mar 18 16:47:19.250051 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.249887 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d48cdb47c-jw9jp"] Mar 18 16:47:19.308259 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.308228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bab677fb-ab67-4c58-9ff4-1e1a862ff304-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.308418 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.308273 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkhgq\" (UniqueName: \"kubernetes.io/projected/bab677fb-ab67-4c58-9ff4-1e1a862ff304-kube-api-access-pkhgq\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " 
pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.308418 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.308320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bab677fb-ab67-4c58-9ff4-1e1a862ff304-crio-socket\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.308418 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.308361 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bab677fb-ab67-4c58-9ff4-1e1a862ff304-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.308556 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.308461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bab677fb-ab67-4c58-9ff4-1e1a862ff304-data-volume\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.409992 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.409650 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bab677fb-ab67-4c58-9ff4-1e1a862ff304-data-volume\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.409992 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.409730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bab677fb-ab67-4c58-9ff4-1e1a862ff304-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.409992 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.409763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkhgq\" (UniqueName: \"kubernetes.io/projected/bab677fb-ab67-4c58-9ff4-1e1a862ff304-kube-api-access-pkhgq\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.409992 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.409780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bab677fb-ab67-4c58-9ff4-1e1a862ff304-crio-socket\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.409992 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.409809 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bab677fb-ab67-4c58-9ff4-1e1a862ff304-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.409992 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.409953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/bab677fb-ab67-4c58-9ff4-1e1a862ff304-crio-socket\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 
16:47:19.410450 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.410183 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/bab677fb-ab67-4c58-9ff4-1e1a862ff304-data-volume\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.410505 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.410483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/bab677fb-ab67-4c58-9ff4-1e1a862ff304-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.413658 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.413634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/bab677fb-ab67-4c58-9ff4-1e1a862ff304-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.439483 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.439459 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkhgq\" (UniqueName: \"kubernetes.io/projected/bab677fb-ab67-4c58-9ff4-1e1a862ff304-kube-api-access-pkhgq\") pod \"insights-runtime-extractor-4bt94\" (UID: \"bab677fb-ab67-4c58-9ff4-1e1a862ff304\") " pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:19.462966 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:19.462623 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4bt94" Mar 18 16:47:20.287155 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:20.287049 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4bt94"] Mar 18 16:47:20.450577 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:20.450545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4bt94" event={"ID":"bab677fb-ab67-4c58-9ff4-1e1a862ff304","Type":"ContainerStarted","Data":"aa5cc89d07336c5e725b88f88a7b5242ad56f4001c60500d9c5482d6414318d3"} Mar 18 16:47:20.450577 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:20.450578 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4bt94" event={"ID":"bab677fb-ab67-4c58-9ff4-1e1a862ff304","Type":"ContainerStarted","Data":"bbceba1aba24238001e23f133d7b7da70f27cc89a6f40df3fbb779031851f1b6"} Mar 18 16:47:20.451804 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:20.451776 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6f4h4" event={"ID":"0972e2c0-041f-46c3-8440-60ac3028c22d","Type":"ContainerStarted","Data":"88e3030d8cdb00f21d96e1f3c768c397a30fbba3b2c61f32deb3e3b26f290ab0"} Mar 18 16:47:20.453203 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:20.453181 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pb5h9" event={"ID":"0690d7d0-de95-4cec-9e24-53b54d9b232d","Type":"ContainerStarted","Data":"a6ec677fd63adb0ef3c4cf905d3e977fc2bf53e07654a0b913dfc45d99a4045a"} Mar 18 16:47:20.453291 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:20.453211 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pb5h9" event={"ID":"0690d7d0-de95-4cec-9e24-53b54d9b232d","Type":"ContainerStarted","Data":"773bd4805a7ec414438b07a0da8a1d5f8cac8f103f687b62be4f342e4f9c298b"} Mar 18 16:47:20.453339 
ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:20.453315 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pb5h9" Mar 18 16:47:20.498098 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:20.498060 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6f4h4" podStartSLOduration=128.736242581 podStartE2EDuration="2m10.498047593s" podCreationTimestamp="2026-03-18 16:45:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:18.379941112 +0000 UTC m=+160.964715115" lastFinishedPulling="2026-03-18 16:47:20.141746109 +0000 UTC m=+162.726520127" observedRunningTime="2026-03-18 16:47:20.497305733 +0000 UTC m=+163.082079759" watchObservedRunningTime="2026-03-18 16:47:20.498047593 +0000 UTC m=+163.082821618" Mar 18 16:47:20.530612 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:20.530564 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pb5h9" podStartSLOduration=128.79529094 podStartE2EDuration="2m10.53054739s" podCreationTimestamp="2026-03-18 16:45:10 +0000 UTC" firstStartedPulling="2026-03-18 16:47:18.402013546 +0000 UTC m=+160.986787549" lastFinishedPulling="2026-03-18 16:47:20.137269991 +0000 UTC m=+162.722043999" observedRunningTime="2026-03-18 16:47:20.524941094 +0000 UTC m=+163.109715136" watchObservedRunningTime="2026-03-18 16:47:20.53054739 +0000 UTC m=+163.115321416" Mar 18 16:47:21.226744 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:21.226706 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:47:21.229392 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:47:21.229366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/73e43277-038f-4657-95b6-addae5fb597c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-jh7zd\" (UID: \"73e43277-038f-4657-95b6-addae5fb597c\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:47:21.461255 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:21.461218 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4bt94" event={"ID":"bab677fb-ab67-4c58-9ff4-1e1a862ff304","Type":"ContainerStarted","Data":"46524d94aee1ade7a3ecdcdf0bcc644c795e7950270232acad1016c1b401aacc"} Mar 18 16:47:21.472730 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:21.472702 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" Mar 18 16:47:21.629302 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:21.629280 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd"] Mar 18 16:47:21.631133 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:47:21.631105 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73e43277_038f_4657_95b6_addae5fb597c.slice/crio-63e9014ce0eaa088b9c03f1563b9b751dd6a75d0e329fd0837c39ed622164fc3 WatchSource:0}: Error finding container 63e9014ce0eaa088b9c03f1563b9b751dd6a75d0e329fd0837c39ed622164fc3: Status 404 returned error can't find the container with id 63e9014ce0eaa088b9c03f1563b9b751dd6a75d0e329fd0837c39ed622164fc3 Mar 18 16:47:22.465026 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:22.464990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" 
event={"ID":"73e43277-038f-4657-95b6-addae5fb597c","Type":"ContainerStarted","Data":"63e9014ce0eaa088b9c03f1563b9b751dd6a75d0e329fd0837c39ed622164fc3"} Mar 18 16:47:23.470249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:23.470206 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4bt94" event={"ID":"bab677fb-ab67-4c58-9ff4-1e1a862ff304","Type":"ContainerStarted","Data":"33c77dd2e86ce58c052416628e6db76402f9af32406f84420fec26faf314c00a"} Mar 18 16:47:23.471596 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:23.471570 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" event={"ID":"73e43277-038f-4657-95b6-addae5fb597c","Type":"ContainerStarted","Data":"9af531d3293396a62f39685e13c389c1197e1be47c81ba0fdcd127f0dfc835f1"} Mar 18 16:47:23.524524 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:23.524468 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" podStartSLOduration=32.773276647 podStartE2EDuration="34.524449052s" podCreationTimestamp="2026-03-18 16:46:49 +0000 UTC" firstStartedPulling="2026-03-18 16:47:21.632759742 +0000 UTC m=+164.217533745" lastFinishedPulling="2026-03-18 16:47:23.383932145 +0000 UTC m=+165.968706150" observedRunningTime="2026-03-18 16:47:23.523928134 +0000 UTC m=+166.108702158" watchObservedRunningTime="2026-03-18 16:47:23.524449052 +0000 UTC m=+166.109223078" Mar 18 16:47:23.524718 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:23.524586 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4bt94" podStartSLOduration=1.99930283 podStartE2EDuration="4.524579751s" podCreationTimestamp="2026-03-18 16:47:19 +0000 UTC" firstStartedPulling="2026-03-18 16:47:20.366919691 +0000 UTC m=+162.951693694" lastFinishedPulling="2026-03-18 16:47:22.892196607 +0000 UTC 
m=+165.476970615" observedRunningTime="2026-03-18 16:47:23.504566511 +0000 UTC m=+166.089340546" watchObservedRunningTime="2026-03-18 16:47:23.524579751 +0000 UTC m=+166.109353774" Mar 18 16:47:27.994445 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:27.994417 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:47:29.255902 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:29.255869 2578 patch_prober.go:28] interesting pod/image-registry-d48cdb47c-jw9jp container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Mar 18 16:47:29.256286 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:29.255922 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" podUID="80a5817b-7fb9-49f5-b276-9dc6d1d1f924" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 16:47:30.463426 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:30.463396 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pb5h9" Mar 18 16:47:32.500067 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:32.500038 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-jh7zd_73e43277-038f-4657-95b6-addae5fb597c/cluster-monitoring-operator/0.log" Mar 18 16:47:32.500471 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:32.500078 2578 generic.go:358] "Generic (PLEG): container finished" podID="73e43277-038f-4657-95b6-addae5fb597c" containerID="9af531d3293396a62f39685e13c389c1197e1be47c81ba0fdcd127f0dfc835f1" exitCode=2 Mar 18 16:47:32.500471 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:32.500165 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" event={"ID":"73e43277-038f-4657-95b6-addae5fb597c","Type":"ContainerDied","Data":"9af531d3293396a62f39685e13c389c1197e1be47c81ba0fdcd127f0dfc835f1"} Mar 18 16:47:32.500471 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:32.500465 2578 scope.go:117] "RemoveContainer" containerID="9af531d3293396a62f39685e13c389c1197e1be47c81ba0fdcd127f0dfc835f1" Mar 18 16:47:33.504424 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:33.504399 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-jh7zd_73e43277-038f-4657-95b6-addae5fb597c/cluster-monitoring-operator/0.log" Mar 18 16:47:33.504753 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:33.504449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-jh7zd" event={"ID":"73e43277-038f-4657-95b6-addae5fb597c","Type":"ContainerStarted","Data":"4ff200913e95e34cfa6cec9cbd5a194f386700755ef0e231be58081c12e18c27"} Mar 18 16:47:36.237765 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.237738 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jzgqd"] Mar 18 16:47:36.241083 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.241058 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.242755 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.242736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Mar 18 16:47:36.242870 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.242842 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Mar 18 16:47:36.243157 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.243141 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Mar 18 16:47:36.243394 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.243378 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-g2m56\"" Mar 18 16:47:36.243468 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.243427 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Mar 18 16:47:36.333495 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.333463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-sys\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.333495 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.333499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " 
pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.333646 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.333520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b98g7\" (UniqueName: \"kubernetes.io/projected/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-kube-api-access-b98g7\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.333646 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.333544 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-textfile\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.333646 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.333568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-root\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.333646 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.333591 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-tls\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.333777 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.333647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-wtmp\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.333777 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.333678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-metrics-client-ca\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.333777 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.333737 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-accelerators-collector-config\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435034 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-accelerators-collector-config\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435211 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435051 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-sys\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435211 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:47:36.435084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435211 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b98g7\" (UniqueName: \"kubernetes.io/projected/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-kube-api-access-b98g7\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435211 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435154 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-textfile\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435211 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-root\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435211 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435183 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-sys\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 
16:47:36.435211 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435205 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-tls\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435556 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435241 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-wtmp\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435556 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-metrics-client-ca\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435556 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435431 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-textfile\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435556 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-root\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 
18 16:47:36.435556 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:36.435516 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Mar 18 16:47:36.435808 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435565 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-wtmp\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435808 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:36.435576 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-tls podName:d8b3bdea-0c2e-4fc9-a142-180b0451b57b nodeName:}" failed. No retries permitted until 2026-03-18 16:47:36.935557398 +0000 UTC m=+179.520331415 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-tls") pod "node-exporter-jzgqd" (UID: "d8b3bdea-0c2e-4fc9-a142-180b0451b57b") : secret "node-exporter-tls" not found Mar 18 16:47:36.435808 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-accelerators-collector-config\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.435808 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.435771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-metrics-client-ca\") pod 
\"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.437783 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.437764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.443326 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.443301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b98g7\" (UniqueName: \"kubernetes.io/projected/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-kube-api-access-b98g7\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.939003 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.938972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-tls\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:36.941393 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:36.941367 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8b3bdea-0c2e-4fc9-a142-180b0451b57b-node-exporter-tls\") pod \"node-exporter-jzgqd\" (UID: \"d8b3bdea-0c2e-4fc9-a142-180b0451b57b\") " pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:37.149687 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:37.149661 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jzgqd" Mar 18 16:47:37.158342 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:47:37.158317 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8b3bdea_0c2e_4fc9_a142_180b0451b57b.slice/crio-ea22f0290af3e7b18e02cdd6fdd692fd1c03be0741f60e9338677f0935740456 WatchSource:0}: Error finding container ea22f0290af3e7b18e02cdd6fdd692fd1c03be0741f60e9338677f0935740456: Status 404 returned error can't find the container with id ea22f0290af3e7b18e02cdd6fdd692fd1c03be0741f60e9338677f0935740456 Mar 18 16:47:37.516890 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:37.516853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jzgqd" event={"ID":"d8b3bdea-0c2e-4fc9-a142-180b0451b57b","Type":"ContainerStarted","Data":"ea22f0290af3e7b18e02cdd6fdd692fd1c03be0741f60e9338677f0935740456"} Mar 18 16:47:38.520239 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:38.520203 2578 generic.go:358] "Generic (PLEG): container finished" podID="d8b3bdea-0c2e-4fc9-a142-180b0451b57b" containerID="41fb5cd3f61ae91f1f4149841c987cfe3e0de7d4eb24b6f00a567fa349ce1f3d" exitCode=0 Mar 18 16:47:38.520239 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:38.520241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jzgqd" event={"ID":"d8b3bdea-0c2e-4fc9-a142-180b0451b57b","Type":"ContainerDied","Data":"41fb5cd3f61ae91f1f4149841c987cfe3e0de7d4eb24b6f00a567fa349ce1f3d"} Mar 18 16:47:39.254232 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:39.254204 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" Mar 18 16:47:39.524796 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:39.524718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jzgqd" 
event={"ID":"d8b3bdea-0c2e-4fc9-a142-180b0451b57b","Type":"ContainerStarted","Data":"b2e347a6a28a49e49838f766d7ae08cafec3357ef911230c4930e127501ff35f"} Mar 18 16:47:39.524796 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:39.524752 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jzgqd" event={"ID":"d8b3bdea-0c2e-4fc9-a142-180b0451b57b","Type":"ContainerStarted","Data":"6f151025ea2b3063de678f8998bcb4f0794ffa1a010b0b9e074f69ec016981cf"} Mar 18 16:47:39.549614 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:39.549571 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jzgqd" podStartSLOduration=2.896328025 podStartE2EDuration="3.549553556s" podCreationTimestamp="2026-03-18 16:47:36 +0000 UTC" firstStartedPulling="2026-03-18 16:47:37.160211172 +0000 UTC m=+179.744985175" lastFinishedPulling="2026-03-18 16:47:37.813436686 +0000 UTC m=+180.398210706" observedRunningTime="2026-03-18 16:47:39.549053011 +0000 UTC m=+182.133827035" watchObservedRunningTime="2026-03-18 16:47:39.549553556 +0000 UTC m=+182.134327584" Mar 18 16:47:40.515368 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.515334 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-675c6476b5-zq6kd"] Mar 18 16:47:40.518279 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.518263 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.520178 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.520155 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Mar 18 16:47:40.520632 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.520615 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Mar 18 16:47:40.520717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.520695 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Mar 18 16:47:40.520772 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.520704 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4qggbumfk3hb0\"" Mar 18 16:47:40.520772 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.520736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-mwqfc\"" Mar 18 16:47:40.520772 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.520738 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Mar 18 16:47:40.529985 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.529964 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-675c6476b5-zq6kd"] Mar 18 16:47:40.561877 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.561850 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-client-ca-bundle\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " 
pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.561987 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.561902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-metrics-server-audit-profiles\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.561987 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.561928 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z49kv\" (UniqueName: \"kubernetes.io/projected/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-kube-api-access-z49kv\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.561987 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.561954 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-secret-metrics-server-tls\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.562174 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.562041 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.562174 ip-10-0-130-255 kubenswrapper[2578]: 
I0318 16:47:40.562066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-audit-log\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.562266 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.562193 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-secret-metrics-server-client-certs\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.662817 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.662792 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.662895 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.662821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-audit-log\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.662895 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.662853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-secret-metrics-server-client-certs\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.662984 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.662964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-client-ca-bundle\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.663027 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.663003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-metrics-server-audit-profiles\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.663027 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.663021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z49kv\" (UniqueName: \"kubernetes.io/projected/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-kube-api-access-z49kv\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.663143 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.663039 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-secret-metrics-server-tls\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " 
pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.663285 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.663268 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-audit-log\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.663528 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.663510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.664139 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.664109 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-metrics-server-audit-profiles\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.665421 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.665399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-client-ca-bundle\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.665531 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.665510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-secret-metrics-server-client-certs\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.665531 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.665522 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-secret-metrics-server-tls\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.670148 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.670130 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z49kv\" (UniqueName: \"kubernetes.io/projected/cf1aad5a-c827-45f3-8bd0-35b26f6885fc-kube-api-access-z49kv\") pod \"metrics-server-675c6476b5-zq6kd\" (UID: \"cf1aad5a-c827-45f3-8bd0-35b26f6885fc\") " pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.827534 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.827465 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:47:40.948723 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.948679 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-675c6476b5-zq6kd"] Mar 18 16:47:40.951398 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:47:40.951370 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf1aad5a_c827_45f3_8bd0_35b26f6885fc.slice/crio-0920a8a8f22c49b25aa0faded37c0e8beb178fc25bdd76031d043ef9a4cd0424 WatchSource:0}: Error finding container 0920a8a8f22c49b25aa0faded37c0e8beb178fc25bdd76031d043ef9a4cd0424: Status 404 returned error can't find the container with id 0920a8a8f22c49b25aa0faded37c0e8beb178fc25bdd76031d043ef9a4cd0424 Mar 18 16:47:40.986999 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.986973 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-km458"] Mar 18 16:47:40.991027 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.991013 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458" Mar 18 16:47:40.992848 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.992832 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Mar 18 16:47:40.992954 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.992937 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-g4wj9\"" Mar 18 16:47:40.996820 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:40.996803 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-km458"] Mar 18 16:47:41.066425 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:41.066395 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/55549973-6078-4e9b-a42a-75ae3ea9c602-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-km458\" (UID: \"55549973-6078-4e9b-a42a-75ae3ea9c602\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458" Mar 18 16:47:41.167724 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:41.167700 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/55549973-6078-4e9b-a42a-75ae3ea9c602-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-km458\" (UID: \"55549973-6078-4e9b-a42a-75ae3ea9c602\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458" Mar 18 16:47:41.167826 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:41.167814 2578 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Mar 18 16:47:41.167874 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:41.167865 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/55549973-6078-4e9b-a42a-75ae3ea9c602-monitoring-plugin-cert podName:55549973-6078-4e9b-a42a-75ae3ea9c602 nodeName:}" failed. No retries permitted until 2026-03-18 16:47:41.667851774 +0000 UTC m=+184.252625777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/55549973-6078-4e9b-a42a-75ae3ea9c602-monitoring-plugin-cert") pod "monitoring-plugin-6d47bdb78d-km458" (UID: "55549973-6078-4e9b-a42a-75ae3ea9c602") : secret "monitoring-plugin-cert" not found Mar 18 16:47:41.529790 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:41.529711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" event={"ID":"cf1aad5a-c827-45f3-8bd0-35b26f6885fc","Type":"ContainerStarted","Data":"0920a8a8f22c49b25aa0faded37c0e8beb178fc25bdd76031d043ef9a4cd0424"} Mar 18 16:47:41.671849 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:41.671802 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/55549973-6078-4e9b-a42a-75ae3ea9c602-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-km458\" (UID: \"55549973-6078-4e9b-a42a-75ae3ea9c602\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458" Mar 18 16:47:41.674675 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:41.674613 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/55549973-6078-4e9b-a42a-75ae3ea9c602-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-km458\" (UID: \"55549973-6078-4e9b-a42a-75ae3ea9c602\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458" Mar 18 16:47:41.900234 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:41.900206 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458" Mar 18 16:47:42.027891 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:42.027846 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-km458"] Mar 18 16:47:42.032143 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:47:42.032104 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55549973_6078_4e9b_a42a_75ae3ea9c602.slice/crio-e81a76e067476689c98a8dcef2ef5f1074e6aa29ce3ab9d3b5bc6a19c99d9968 WatchSource:0}: Error finding container e81a76e067476689c98a8dcef2ef5f1074e6aa29ce3ab9d3b5bc6a19c99d9968: Status 404 returned error can't find the container with id e81a76e067476689c98a8dcef2ef5f1074e6aa29ce3ab9d3b5bc6a19c99d9968 Mar 18 16:47:42.533410 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:42.533373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458" event={"ID":"55549973-6078-4e9b-a42a-75ae3ea9c602","Type":"ContainerStarted","Data":"e81a76e067476689c98a8dcef2ef5f1074e6aa29ce3ab9d3b5bc6a19c99d9968"} Mar 18 16:47:42.534623 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:42.534605 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" event={"ID":"cf1aad5a-c827-45f3-8bd0-35b26f6885fc","Type":"ContainerStarted","Data":"d2d6d3b094d1e6bcc89abec993e0d9020ccdd6c85d3a2662a125ed5c90a44f99"} Mar 18 16:47:42.556662 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:42.556610 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" podStartSLOduration=1.031154485 podStartE2EDuration="2.55659583s" podCreationTimestamp="2026-03-18 16:47:40 +0000 UTC" firstStartedPulling="2026-03-18 16:47:40.953319704 +0000 UTC m=+183.538093710" lastFinishedPulling="2026-03-18 
16:47:42.478761049 +0000 UTC m=+185.063535055" observedRunningTime="2026-03-18 16:47:42.555064117 +0000 UTC m=+185.139838154" watchObservedRunningTime="2026-03-18 16:47:42.55659583 +0000 UTC m=+185.141369857"
Mar 18 16:47:44.272780 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.272734 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" podUID="80a5817b-7fb9-49f5-b276-9dc6d1d1f924" containerName="registry" containerID="cri-o://1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776" gracePeriod=30
Mar 18 16:47:44.500711 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.500690 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:47:44.540555 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.540475 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458" event={"ID":"55549973-6078-4e9b-a42a-75ae3ea9c602","Type":"ContainerStarted","Data":"03190b35ca5d8c59c8ef79db5eca5961883d51b63e54a09280a05b20d7cfd327"}
Mar 18 16:47:44.540716 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.540703 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458"
Mar 18 16:47:44.541614 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.541588 2578 generic.go:358] "Generic (PLEG): container finished" podID="80a5817b-7fb9-49f5-b276-9dc6d1d1f924" containerID="1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776" exitCode=0
Mar 18 16:47:44.541733 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.541632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" event={"ID":"80a5817b-7fb9-49f5-b276-9dc6d1d1f924","Type":"ContainerDied","Data":"1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776"}
Mar 18 16:47:44.541733 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.541657 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp" event={"ID":"80a5817b-7fb9-49f5-b276-9dc6d1d1f924","Type":"ContainerDied","Data":"fd6972f18d00e45c7002e7811c0318699585a76f11d9dd6c3d59a7123bf1f93d"}
Mar 18 16:47:44.541733 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.541674 2578 scope.go:117] "RemoveContainer" containerID="1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776"
Mar 18 16:47:44.541733 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.541676 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-d48cdb47c-jw9jp"
Mar 18 16:47:44.545356 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.545335 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458"
Mar 18 16:47:44.549610 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.549595 2578 scope.go:117] "RemoveContainer" containerID="1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776"
Mar 18 16:47:44.549846 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:47:44.549828 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776\": container with ID starting with 1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776 not found: ID does not exist" containerID="1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776"
Mar 18 16:47:44.549900 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.549856 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776"} err="failed to get container status \"1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776\": rpc error: code = NotFound desc = could not find container \"1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776\": container with ID starting with 1e7c311c7095d0938c2a2d515b77e33e7fa996ed9c19576c129b7608d5868776 not found: ID does not exist"
Mar 18 16:47:44.554925 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.554889 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-km458" podStartSLOduration=2.985486291 podStartE2EDuration="4.55487957s" podCreationTimestamp="2026-03-18 16:47:40 +0000 UTC" firstStartedPulling="2026-03-18 16:47:42.0342017 +0000 UTC m=+184.618975718" lastFinishedPulling="2026-03-18 16:47:43.603594991 +0000 UTC m=+186.188368997" observedRunningTime="2026-03-18 16:47:44.554029548 +0000 UTC m=+187.138803567" watchObservedRunningTime="2026-03-18 16:47:44.55487957 +0000 UTC m=+187.139653613"
Mar 18 16:47:44.596984 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.596963 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-bound-sa-token\") pod \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") "
Mar 18 16:47:44.597064 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.596996 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") pod \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") "
Mar 18 16:47:44.597064 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.597017 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-ca-trust-extracted\") pod \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") "
Mar 18 16:47:44.597064 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.597046 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-image-registry-private-configuration\") pod \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") "
Mar 18 16:47:44.597230 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.597085 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-certificates\") pod \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") "
Mar 18 16:47:44.597230 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.597172 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-installation-pull-secrets\") pod \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") "
Mar 18 16:47:44.597230 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.597206 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-trusted-ca\") pod \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") "
Mar 18 16:47:44.597364 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.597239 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfhfr\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-kube-api-access-rfhfr\") pod \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\" (UID: \"80a5817b-7fb9-49f5-b276-9dc6d1d1f924\") "
Mar 18 16:47:44.597753 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.597725 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "80a5817b-7fb9-49f5-b276-9dc6d1d1f924" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:47:44.598045 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.598005 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "80a5817b-7fb9-49f5-b276-9dc6d1d1f924" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:47:44.599843 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.599802 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "80a5817b-7fb9-49f5-b276-9dc6d1d1f924" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:47:44.599943 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.599919 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "80a5817b-7fb9-49f5-b276-9dc6d1d1f924" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:47:44.600251 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.600228 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "80a5817b-7fb9-49f5-b276-9dc6d1d1f924" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:47:44.600363 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.600309 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "80a5817b-7fb9-49f5-b276-9dc6d1d1f924" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:47:44.600420 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.600388 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-kube-api-access-rfhfr" (OuterVolumeSpecName: "kube-api-access-rfhfr") pod "80a5817b-7fb9-49f5-b276-9dc6d1d1f924" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924"). InnerVolumeSpecName "kube-api-access-rfhfr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:47:44.606585 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.606563 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "80a5817b-7fb9-49f5-b276-9dc6d1d1f924" (UID: "80a5817b-7fb9-49f5-b276-9dc6d1d1f924"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:47:44.697912 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.697884 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-certificates\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:47:44.697912 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.697907 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-installation-pull-secrets\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:47:44.697912 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.697917 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-trusted-ca\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:47:44.698088 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.697926 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rfhfr\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-kube-api-access-rfhfr\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:47:44.698088 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.697935 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-bound-sa-token\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:47:44.698088 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.697943 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-registry-tls\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:47:44.698088 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.697951 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-ca-trust-extracted\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:47:44.698088 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.697959 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/80a5817b-7fb9-49f5-b276-9dc6d1d1f924-image-registry-private-configuration\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:47:44.860036 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.860006 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-d48cdb47c-jw9jp"]
Mar 18 16:47:44.862857 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:44.862836 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-d48cdb47c-jw9jp"]
Mar 18 16:47:45.997245 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:45.997215 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80a5817b-7fb9-49f5-b276-9dc6d1d1f924" path="/var/lib/kubelet/pods/80a5817b-7fb9-49f5-b276-9dc6d1d1f924/volumes"
Mar 18 16:47:54.902025 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.901992 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68ff6fbcb4-vrzlm"]
Mar 18 16:47:54.902491 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.902248 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80a5817b-7fb9-49f5-b276-9dc6d1d1f924" containerName="registry"
Mar 18 16:47:54.902491 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.902259 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="80a5817b-7fb9-49f5-b276-9dc6d1d1f924" containerName="registry"
Mar 18 16:47:54.902491 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.902307 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="80a5817b-7fb9-49f5-b276-9dc6d1d1f924" containerName="registry"
Mar 18 16:47:54.910513 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.910495 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:54.913019 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.912992 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-vrmw6\""
Mar 18 16:47:54.913155 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.913019 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Mar 18 16:47:54.913155 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.913030 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Mar 18 16:47:54.913155 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.913034 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Mar 18 16:47:54.913155 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.913064 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Mar 18 16:47:54.913155 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.912992 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Mar 18 16:47:54.913155 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.913034 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Mar 18 16:47:54.913155 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.913040 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Mar 18 16:47:54.915677 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.915657 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68ff6fbcb4-vrzlm"]
Mar 18 16:47:54.967493 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.967464 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-oauth-serving-cert\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:54.967622 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.967498 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hvpj\" (UniqueName: \"kubernetes.io/projected/14e65f3d-e513-4f91-9eb7-e8606668c8a1-kube-api-access-6hvpj\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:54.967622 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.967526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-oauth-config\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:54.967622 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.967597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-service-ca\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:54.967741 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.967631 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-config\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:54.967741 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:54.967659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-serving-cert\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.068784 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.068752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-oauth-serving-cert\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.068918 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.068786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hvpj\" (UniqueName: \"kubernetes.io/projected/14e65f3d-e513-4f91-9eb7-e8606668c8a1-kube-api-access-6hvpj\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.068918 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.068815 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-oauth-config\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.068918 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.068848 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-service-ca\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.068918 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.068884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-config\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.069149 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.068922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-serving-cert\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.069598 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.069528 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-service-ca\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.069598 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.069534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-config\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.071501 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.071477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-oauth-config\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.071598 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.071484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-serving-cert\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.074983 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.074965 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-oauth-serving-cert\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.075950 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.075933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hvpj\" (UniqueName: \"kubernetes.io/projected/14e65f3d-e513-4f91-9eb7-e8606668c8a1-kube-api-access-6hvpj\") pod \"console-68ff6fbcb4-vrzlm\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.219727 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.219654 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68ff6fbcb4-vrzlm"
Mar 18 16:47:55.344532 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.344502 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68ff6fbcb4-vrzlm"]
Mar 18 16:47:55.347710 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:47:55.347677 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14e65f3d_e513_4f91_9eb7_e8606668c8a1.slice/crio-59279cca4965530bd2e9e0eb4dba2123e2d8519d0d15594298062d9e258f0133 WatchSource:0}: Error finding container 59279cca4965530bd2e9e0eb4dba2123e2d8519d0d15594298062d9e258f0133: Status 404 returned error can't find the container with id 59279cca4965530bd2e9e0eb4dba2123e2d8519d0d15594298062d9e258f0133
Mar 18 16:47:55.572855 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:55.572777 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ff6fbcb4-vrzlm" event={"ID":"14e65f3d-e513-4f91-9eb7-e8606668c8a1","Type":"ContainerStarted","Data":"59279cca4965530bd2e9e0eb4dba2123e2d8519d0d15594298062d9e258f0133"}
Mar 18 16:47:58.581117 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:58.581042 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ff6fbcb4-vrzlm" event={"ID":"14e65f3d-e513-4f91-9eb7-e8606668c8a1","Type":"ContainerStarted","Data":"0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94"}
Mar 18 16:47:58.597783 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:47:58.597742 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68ff6fbcb4-vrzlm" podStartSLOduration=2.055326637 podStartE2EDuration="4.597729121s" podCreationTimestamp="2026-03-18 16:47:54 +0000 UTC" firstStartedPulling="2026-03-18 16:47:55.350041994 +0000 UTC m=+197.934815997" lastFinishedPulling="2026-03-18 16:47:57.892444477 +0000 UTC m=+200.477218481" observedRunningTime="2026-03-18 16:47:58.596681957 +0000 UTC m=+201.181455993" watchObservedRunningTime="2026-03-18 16:47:58.597729121 +0000 UTC m=+201.182503146"
Mar 18 16:48:00.828458 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:00.828424 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd"
Mar 18 16:48:00.828458 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:00.828466 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd"
Mar 18 16:48:02.478470 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.478432 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d45545b58-nff4w"]
Mar 18 16:48:02.483023 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.483023 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.489041 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.489018 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Mar 18 16:48:02.490074 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.490052 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d45545b58-nff4w"]
Mar 18 16:48:02.525335 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.525309 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-config\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.525444 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.525341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-oauth-config\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.525444 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.525373 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rn8w\" (UniqueName: \"kubernetes.io/projected/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-kube-api-access-2rn8w\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.525533 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.525450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-trusted-ca-bundle\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.525533 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.525486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-serving-cert\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.525533 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.525502 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-oauth-serving-cert\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.525533 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.525520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-service-ca\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.626284 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.626256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-trusted-ca-bundle\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.626457 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.626290 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-serving-cert\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.626457 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.626306 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-oauth-serving-cert\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.626457 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.626323 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-service-ca\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.626612 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.626473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-config\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.626612 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.626506 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-oauth-config\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.626612 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.626554 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rn8w\" (UniqueName: \"kubernetes.io/projected/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-kube-api-access-2rn8w\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.627146 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.627058 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-oauth-serving-cert\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.627352 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.627203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-service-ca\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.627352 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.627242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-config\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.627553 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.627534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-trusted-ca-bundle\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.628992 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.628969 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-oauth-config\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.629086 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.628976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-serving-cert\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.635977 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.635960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rn8w\" (UniqueName: \"kubernetes.io/projected/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-kube-api-access-2rn8w\") pod \"console-7d45545b58-nff4w\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") " pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.791980 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.791899 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:48:02.909841 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:02.909812 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d45545b58-nff4w"]
Mar 18 16:48:02.912569 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:48:02.912541 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c311900_9223_4d5c_8e00_6b3ecf8be5c8.slice/crio-a075c3419ee5253ced68d28fbd79f9c7e3ce4fa77954dd7fec67e6e01cbc49b9 WatchSource:0}: Error finding container a075c3419ee5253ced68d28fbd79f9c7e3ce4fa77954dd7fec67e6e01cbc49b9: Status 404 returned error can't find the container with id a075c3419ee5253ced68d28fbd79f9c7e3ce4fa77954dd7fec67e6e01cbc49b9
Mar 18 16:48:03.595318 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:03.595284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d45545b58-nff4w" event={"ID":"9c311900-9223-4d5c-8e00-6b3ecf8be5c8","Type":"ContainerStarted","Data":"6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba"}
Mar 18 16:48:03.595318 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:03.595320 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d45545b58-nff4w" event={"ID":"9c311900-9223-4d5c-8e00-6b3ecf8be5c8","Type":"ContainerStarted","Data":"a075c3419ee5253ced68d28fbd79f9c7e3ce4fa77954dd7fec67e6e01cbc49b9"}
Mar 18 16:48:03.612857 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:03.612814 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d45545b58-nff4w" podStartSLOduration=1.612801304 podStartE2EDuration="1.612801304s" podCreationTimestamp="2026-03-18 16:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:48:03.611991654 +0000 UTC
m=+206.196765679" watchObservedRunningTime="2026-03-18 16:48:03.612801304 +0000 UTC m=+206.197575329" Mar 18 16:48:05.219792 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:05.219760 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68ff6fbcb4-vrzlm" Mar 18 16:48:05.220182 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:05.219801 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68ff6fbcb4-vrzlm" Mar 18 16:48:05.224428 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:05.224408 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68ff6fbcb4-vrzlm" Mar 18 16:48:05.605391 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:05.605320 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68ff6fbcb4-vrzlm" Mar 18 16:48:12.792815 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:12.792784 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d45545b58-nff4w" Mar 18 16:48:12.793212 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:12.792858 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d45545b58-nff4w" Mar 18 16:48:12.797164 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:12.797146 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7d45545b58-nff4w" Mar 18 16:48:13.624824 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:13.624799 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7d45545b58-nff4w" Mar 18 16:48:13.669344 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:13.669304 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68ff6fbcb4-vrzlm"] Mar 18 16:48:20.833163 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:48:20.833130 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:48:20.836999 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:20.836976 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-675c6476b5-zq6kd" Mar 18 16:48:22.645981 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:22.645946 2578 generic.go:358] "Generic (PLEG): container finished" podID="143d5104-122a-4bd9-ac1e-35fce758029a" containerID="843d98527978f02cf50344d6c870a766293dda341bd800ab8e971c4f7e0ee90c" exitCode=0 Mar 18 16:48:22.646353 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:22.645986 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" event={"ID":"143d5104-122a-4bd9-ac1e-35fce758029a","Type":"ContainerDied","Data":"843d98527978f02cf50344d6c870a766293dda341bd800ab8e971c4f7e0ee90c"} Mar 18 16:48:22.646353 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:22.646277 2578 scope.go:117] "RemoveContainer" containerID="843d98527978f02cf50344d6c870a766293dda341bd800ab8e971c4f7e0ee90c" Mar 18 16:48:23.650447 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:23.650415 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-bv4qv" event={"ID":"143d5104-122a-4bd9-ac1e-35fce758029a","Type":"ContainerStarted","Data":"f06a719c5b57b66c09befcc7e3580f79891be89b991e62cfbc6d859263fc6e37"} Mar 18 16:48:38.688480 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:38.688409 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68ff6fbcb4-vrzlm" podUID="14e65f3d-e513-4f91-9eb7-e8606668c8a1" containerName="console" containerID="cri-o://0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94" gracePeriod=15 Mar 18 16:48:38.943727 ip-10-0-130-255 kubenswrapper[2578]: 
I0318 16:48:38.943668 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68ff6fbcb4-vrzlm_14e65f3d-e513-4f91-9eb7-e8606668c8a1/console/0.log" Mar 18 16:48:38.943895 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:38.943733 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68ff6fbcb4-vrzlm" Mar 18 16:48:39.127449 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.127409 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-config\") pod \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " Mar 18 16:48:39.127449 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.127450 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-oauth-config\") pod \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " Mar 18 16:48:39.127691 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.127489 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-service-ca\") pod \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " Mar 18 16:48:39.127691 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.127509 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-oauth-serving-cert\") pod \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " Mar 18 16:48:39.127691 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.127622 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-serving-cert\") pod \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " Mar 18 16:48:39.127835 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.127690 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hvpj\" (UniqueName: \"kubernetes.io/projected/14e65f3d-e513-4f91-9eb7-e8606668c8a1-kube-api-access-6hvpj\") pod \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\" (UID: \"14e65f3d-e513-4f91-9eb7-e8606668c8a1\") " Mar 18 16:48:39.127906 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.127872 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-service-ca" (OuterVolumeSpecName: "service-ca") pod "14e65f3d-e513-4f91-9eb7-e8606668c8a1" (UID: "14e65f3d-e513-4f91-9eb7-e8606668c8a1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:39.127952 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.127899 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-config" (OuterVolumeSpecName: "console-config") pod "14e65f3d-e513-4f91-9eb7-e8606668c8a1" (UID: "14e65f3d-e513-4f91-9eb7-e8606668c8a1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:39.127952 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.127918 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "14e65f3d-e513-4f91-9eb7-e8606668c8a1" (UID: "14e65f3d-e513-4f91-9eb7-e8606668c8a1"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:39.129970 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.129946 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "14e65f3d-e513-4f91-9eb7-e8606668c8a1" (UID: "14e65f3d-e513-4f91-9eb7-e8606668c8a1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:39.130038 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.129963 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "14e65f3d-e513-4f91-9eb7-e8606668c8a1" (UID: "14e65f3d-e513-4f91-9eb7-e8606668c8a1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:39.130038 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.129971 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e65f3d-e513-4f91-9eb7-e8606668c8a1-kube-api-access-6hvpj" (OuterVolumeSpecName: "kube-api-access-6hvpj") pod "14e65f3d-e513-4f91-9eb7-e8606668c8a1" (UID: "14e65f3d-e513-4f91-9eb7-e8606668c8a1"). InnerVolumeSpecName "kube-api-access-6hvpj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:39.229324 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.229225 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hvpj\" (UniqueName: \"kubernetes.io/projected/14e65f3d-e513-4f91-9eb7-e8606668c8a1-kube-api-access-6hvpj\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:48:39.229324 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.229263 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:48:39.229324 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.229276 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-oauth-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:48:39.229324 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.229288 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-service-ca\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:48:39.229324 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.229301 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/14e65f3d-e513-4f91-9eb7-e8606668c8a1-oauth-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:48:39.229324 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.229312 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/14e65f3d-e513-4f91-9eb7-e8606668c8a1-console-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:48:39.700725 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:48:39.700700 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68ff6fbcb4-vrzlm_14e65f3d-e513-4f91-9eb7-e8606668c8a1/console/0.log" Mar 18 16:48:39.701119 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.700739 2578 generic.go:358] "Generic (PLEG): container finished" podID="14e65f3d-e513-4f91-9eb7-e8606668c8a1" containerID="0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94" exitCode=2 Mar 18 16:48:39.701119 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.700798 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ff6fbcb4-vrzlm" event={"ID":"14e65f3d-e513-4f91-9eb7-e8606668c8a1","Type":"ContainerDied","Data":"0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94"} Mar 18 16:48:39.701119 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.700802 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68ff6fbcb4-vrzlm" Mar 18 16:48:39.701119 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.700824 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68ff6fbcb4-vrzlm" event={"ID":"14e65f3d-e513-4f91-9eb7-e8606668c8a1","Type":"ContainerDied","Data":"59279cca4965530bd2e9e0eb4dba2123e2d8519d0d15594298062d9e258f0133"} Mar 18 16:48:39.701119 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.700839 2578 scope.go:117] "RemoveContainer" containerID="0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94" Mar 18 16:48:39.709769 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.709754 2578 scope.go:117] "RemoveContainer" containerID="0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94" Mar 18 16:48:39.710003 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:48:39.709985 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94\": container with ID starting with 0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94 not found: ID does not exist" containerID="0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94" Mar 18 16:48:39.710059 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.710012 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94"} err="failed to get container status \"0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94\": rpc error: code = NotFound desc = could not find container \"0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94\": container with ID starting with 0fded288daf1ea9e0446d71fa32a515132803db21e67917f3ad625980c8bdc94 not found: ID does not exist" Mar 18 16:48:39.721105 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.721068 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68ff6fbcb4-vrzlm"] Mar 18 16:48:39.725283 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.725263 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68ff6fbcb4-vrzlm"] Mar 18 16:48:39.996690 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:39.996619 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e65f3d-e513-4f91-9eb7-e8606668c8a1" path="/var/lib/kubelet/pods/14e65f3d-e513-4f91-9eb7-e8606668c8a1/volumes" Mar 18 16:48:49.713857 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:49.713794 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:48:49.716345 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:48:49.716320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/474e4d0f-dbfc-41a4-ad8f-fcada6a1b880-metrics-certs\") pod \"network-metrics-daemon-jcvjp\" (UID: \"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880\") " pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:48:49.897951 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:49.897919 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-w8h72\"" Mar 18 16:48:49.905842 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:49.905824 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jcvjp" Mar 18 16:48:50.027237 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:50.027203 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jcvjp"] Mar 18 16:48:50.030725 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:48:50.030696 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474e4d0f_dbfc_41a4_ad8f_fcada6a1b880.slice/crio-9f6287ccbd22131d8c5b849971a36b1246fd2f5ff91ad40df5bb6635f6f7535b WatchSource:0}: Error finding container 9f6287ccbd22131d8c5b849971a36b1246fd2f5ff91ad40df5bb6635f6f7535b: Status 404 returned error can't find the container with id 9f6287ccbd22131d8c5b849971a36b1246fd2f5ff91ad40df5bb6635f6f7535b Mar 18 16:48:50.732252 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:50.732217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jcvjp" event={"ID":"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880","Type":"ContainerStarted","Data":"9f6287ccbd22131d8c5b849971a36b1246fd2f5ff91ad40df5bb6635f6f7535b"} Mar 18 16:48:51.736257 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:51.736218 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jcvjp" event={"ID":"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880","Type":"ContainerStarted","Data":"33bfcbac238f75e47e68510ee5d9bc9e59d014f13d32674d0d6c2479a545519c"} Mar 18 16:48:51.736257 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:51.736261 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jcvjp" event={"ID":"474e4d0f-dbfc-41a4-ad8f-fcada6a1b880","Type":"ContainerStarted","Data":"6db4b657f2bb2bf769fe0d34ea074fb384198563df4b887496e9098fcc2659f8"} Mar 18 16:48:51.752989 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:48:51.752939 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jcvjp" podStartSLOduration=252.815884799 podStartE2EDuration="4m13.752926817s" podCreationTimestamp="2026-03-18 16:44:38 +0000 UTC" firstStartedPulling="2026-03-18 16:48:50.032876045 +0000 UTC m=+252.617650052" lastFinishedPulling="2026-03-18 16:48:50.969918053 +0000 UTC m=+253.554692070" observedRunningTime="2026-03-18 16:48:51.752790157 +0000 UTC m=+254.337564182" watchObservedRunningTime="2026-03-18 16:48:51.752926817 +0000 UTC m=+254.337700877" Mar 18 16:49:00.560726 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.560647 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-656bbb49d4-xw74z"] Mar 18 16:49:00.561071 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.560904 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14e65f3d-e513-4f91-9eb7-e8606668c8a1" containerName="console" Mar 18 16:49:00.561071 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.560915 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e65f3d-e513-4f91-9eb7-e8606668c8a1" containerName="console" Mar 18 16:49:00.561071 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.560963 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="14e65f3d-e513-4f91-9eb7-e8606668c8a1" containerName="console" Mar 18 16:49:00.562993 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.562978 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" Mar 18 16:49:00.566020 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.565997 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Mar 18 16:49:00.566692 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.566670 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Mar 18 16:49:00.567149 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.567132 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Mar 18 16:49:00.567630 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.567610 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Mar 18 16:49:00.568186 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.568167 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Mar 18 16:49:00.570177 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.570160 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-jff76\"" Mar 18 16:49:00.577016 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.576997 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Mar 18 16:49:00.588569 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.588548 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/telemeter-client-656bbb49d4-xw74z"] Mar 18 16:49:00.593576 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.593559 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-telemeter-client-tls\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" Mar 18 16:49:00.593659 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.593587 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-federate-client-tls\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" Mar 18 16:49:00.593659 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.593619 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" Mar 18 16:49:00.593729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.593676 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-secret-telemeter-client\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" Mar 18 16:49:00.593729 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:49:00.593705 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4df74c11-4f50-4513-944d-19379b1f4184-serving-certs-ca-bundle\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" Mar 18 16:49:00.593794 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.593777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4df74c11-4f50-4513-944d-19379b1f4184-telemeter-trusted-ca-bundle\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" Mar 18 16:49:00.593830 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.593818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4df74c11-4f50-4513-944d-19379b1f4184-metrics-client-ca\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" Mar 18 16:49:00.593873 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.593858 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqr7v\" (UniqueName: \"kubernetes.io/projected/4df74c11-4f50-4513-944d-19379b1f4184-kube-api-access-cqr7v\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" Mar 18 16:49:00.694476 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.694437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.694476 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.694480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-secret-telemeter-client\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.694699 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.694502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4df74c11-4f50-4513-944d-19379b1f4184-serving-certs-ca-bundle\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.694699 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.694558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4df74c11-4f50-4513-944d-19379b1f4184-telemeter-trusted-ca-bundle\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.694699 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.694588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4df74c11-4f50-4513-944d-19379b1f4184-metrics-client-ca\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.694699 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.694614 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqr7v\" (UniqueName: \"kubernetes.io/projected/4df74c11-4f50-4513-944d-19379b1f4184-kube-api-access-cqr7v\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.694901 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.694763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-telemeter-client-tls\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.694901 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.694825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-federate-client-tls\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.695323 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.695296 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4df74c11-4f50-4513-944d-19379b1f4184-serving-certs-ca-bundle\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.695573 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.695313 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4df74c11-4f50-4513-944d-19379b1f4184-metrics-client-ca\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.695573 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.695608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4df74c11-4f50-4513-944d-19379b1f4184-telemeter-trusted-ca-bundle\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.697077 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.697055 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.697469 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.697450 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-federate-client-tls\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.697811 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.697792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-telemeter-client-tls\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.697847 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.697792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4df74c11-4f50-4513-944d-19379b1f4184-secret-telemeter-client\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.702834 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.702816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqr7v\" (UniqueName: \"kubernetes.io/projected/4df74c11-4f50-4513-944d-19379b1f4184-kube-api-access-cqr7v\") pod \"telemeter-client-656bbb49d4-xw74z\" (UID: \"4df74c11-4f50-4513-944d-19379b1f4184\") " pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.872028 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.871954 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z"
Mar 18 16:49:00.992543 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:00.992509 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-656bbb49d4-xw74z"]
Mar 18 16:49:00.996203 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:49:00.996176 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4df74c11_4f50_4513_944d_19379b1f4184.slice/crio-e4fe2231020260cdb51000444594e2da3a544cbc4f6f3b046d5ae5542da53c81 WatchSource:0}: Error finding container e4fe2231020260cdb51000444594e2da3a544cbc4f6f3b046d5ae5542da53c81: Status 404 returned error can't find the container with id e4fe2231020260cdb51000444594e2da3a544cbc4f6f3b046d5ae5542da53c81
Mar 18 16:49:01.762362 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:01.762322 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" event={"ID":"4df74c11-4f50-4513-944d-19379b1f4184","Type":"ContainerStarted","Data":"e4fe2231020260cdb51000444594e2da3a544cbc4f6f3b046d5ae5542da53c81"}
Mar 18 16:49:03.770199 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:03.770154 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" event={"ID":"4df74c11-4f50-4513-944d-19379b1f4184","Type":"ContainerStarted","Data":"0b5bec4e03e9c7005a94576b0dd6b6ad31845d7a0b8696d81d65c8ba7c6d2bdf"}
Mar 18 16:49:04.774249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:04.774208 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" event={"ID":"4df74c11-4f50-4513-944d-19379b1f4184","Type":"ContainerStarted","Data":"a68157f8d7b244a86bf10dff78171f72ab3e980cf8bb06b772008ed8fce79e7f"}
Mar 18 16:49:04.774249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:04.774248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" event={"ID":"4df74c11-4f50-4513-944d-19379b1f4184","Type":"ContainerStarted","Data":"6315d69bfec49c52ddc615323c3e05ae3d0def598bfe8b222452bae1f95ba427"}
Mar 18 16:49:04.795076 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:04.795015 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-656bbb49d4-xw74z" podStartSLOduration=1.7708021760000001 podStartE2EDuration="4.794998898s" podCreationTimestamp="2026-03-18 16:49:00 +0000 UTC" firstStartedPulling="2026-03-18 16:49:00.997908506 +0000 UTC m=+263.582682511" lastFinishedPulling="2026-03-18 16:49:04.02210521 +0000 UTC m=+266.606879233" observedRunningTime="2026-03-18 16:49:04.794319984 +0000 UTC m=+267.379094009" watchObservedRunningTime="2026-03-18 16:49:04.794998898 +0000 UTC m=+267.379772924"
Mar 18 16:49:05.408695 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.408658 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d495f8cc9-mdmhb"]
Mar 18 16:49:05.410644 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.410629 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.423813 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.423790 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d495f8cc9-mdmhb"]
Mar 18 16:49:05.532501 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.532470 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-oauth-serving-cert\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.532501 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.532514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-oauth-config\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.532719 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.532536 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcdn2\" (UniqueName: \"kubernetes.io/projected/97c5fe9c-9531-416c-9e82-bd4652596c45-kube-api-access-tcdn2\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.532719 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.532609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-serving-cert\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.532719 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.532652 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-console-config\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.532719 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.532678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-trusted-ca-bundle\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.532719 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.532699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-service-ca\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.633842 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.633805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-oauth-config\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634032 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.633847 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcdn2\" (UniqueName: \"kubernetes.io/projected/97c5fe9c-9531-416c-9e82-bd4652596c45-kube-api-access-tcdn2\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634032 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.633876 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-serving-cert\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634032 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.633907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-console-config\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634032 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.633932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-trusted-ca-bundle\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634278 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.634065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-service-ca\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634278 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.634153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-oauth-serving-cert\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634757 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.634720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-console-config\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634757 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.634745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-service-ca\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634946 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.634866 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-trusted-ca-bundle\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.634946 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.634885 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-oauth-serving-cert\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.636542 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.636520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-oauth-config\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.637177 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.637159 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-serving-cert\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.641249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.641227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcdn2\" (UniqueName: \"kubernetes.io/projected/97c5fe9c-9531-416c-9e82-bd4652596c45-kube-api-access-tcdn2\") pod \"console-d495f8cc9-mdmhb\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.719380 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.719293 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:05.840807 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:05.840777 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d495f8cc9-mdmhb"]
Mar 18 16:49:05.843933 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:49:05.843907 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c5fe9c_9531_416c_9e82_bd4652596c45.slice/crio-8cea9e4d7cb727df7d72c2f9129f15321eded3d18ebb744868176ef1ab933903 WatchSource:0}: Error finding container 8cea9e4d7cb727df7d72c2f9129f15321eded3d18ebb744868176ef1ab933903: Status 404 returned error can't find the container with id 8cea9e4d7cb727df7d72c2f9129f15321eded3d18ebb744868176ef1ab933903
Mar 18 16:49:06.781924 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:06.781884 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d495f8cc9-mdmhb" event={"ID":"97c5fe9c-9531-416c-9e82-bd4652596c45","Type":"ContainerStarted","Data":"f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7"}
Mar 18 16:49:06.781924 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:06.781923 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d495f8cc9-mdmhb" event={"ID":"97c5fe9c-9531-416c-9e82-bd4652596c45","Type":"ContainerStarted","Data":"8cea9e4d7cb727df7d72c2f9129f15321eded3d18ebb744868176ef1ab933903"}
Mar 18 16:49:06.799141 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:06.799074 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d495f8cc9-mdmhb" podStartSLOduration=1.799059686 podStartE2EDuration="1.799059686s" podCreationTimestamp="2026-03-18 16:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:49:06.798341773 +0000 UTC m=+269.383115812" watchObservedRunningTime="2026-03-18 16:49:06.799059686 +0000 UTC m=+269.383833712"
Mar 18 16:49:15.720495 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:15.720456 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:15.720495 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:15.720496 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:15.725260 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:15.725240 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:15.811432 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:15.811401 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d495f8cc9-mdmhb"
Mar 18 16:49:15.855779 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:15.855750 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d45545b58-nff4w"]
Mar 18 16:49:37.862082 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:37.862052 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-jh7zd_73e43277-038f-4657-95b6-addae5fb597c/cluster-monitoring-operator/0.log"
Mar 18 16:49:37.862554 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:37.862540 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-jh7zd_73e43277-038f-4657-95b6-addae5fb597c/cluster-monitoring-operator/0.log"
Mar 18 16:49:37.868961 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:37.868934 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/0.log"
Mar 18 16:49:37.869513 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:37.869492 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/0.log"
Mar 18 16:49:37.875003 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:37.874980 2578 kubelet.go:1628] "Image garbage collection succeeded"
Mar 18 16:49:40.875249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:40.875176 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d45545b58-nff4w" podUID="9c311900-9223-4d5c-8e00-6b3ecf8be5c8" containerName="console" containerID="cri-o://6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba" gracePeriod=15
Mar 18 16:49:41.106686 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.106665 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d45545b58-nff4w_9c311900-9223-4d5c-8e00-6b3ecf8be5c8/console/0.log"
Mar 18 16:49:41.106809 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.106725 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:49:41.199501 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.199471 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-config\") pod \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") "
Mar 18 16:49:41.199673 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.199571 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-trusted-ca-bundle\") pod \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") "
Mar 18 16:49:41.199673 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.199658 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-service-ca\") pod \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") "
Mar 18 16:49:41.199767 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.199694 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-oauth-serving-cert\") pod \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") "
Mar 18 16:49:41.199767 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.199719 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rn8w\" (UniqueName: \"kubernetes.io/projected/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-kube-api-access-2rn8w\") pod \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") "
Mar 18 16:49:41.199767 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.199747 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-serving-cert\") pod \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") "
Mar 18 16:49:41.199911 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.199821 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-oauth-config\") pod \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\" (UID: \"9c311900-9223-4d5c-8e00-6b3ecf8be5c8\") "
Mar 18 16:49:41.199911 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.199845 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9c311900-9223-4d5c-8e00-6b3ecf8be5c8" (UID: "9c311900-9223-4d5c-8e00-6b3ecf8be5c8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:41.200069 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.200031 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-config" (OuterVolumeSpecName: "console-config") pod "9c311900-9223-4d5c-8e00-6b3ecf8be5c8" (UID: "9c311900-9223-4d5c-8e00-6b3ecf8be5c8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:41.200069 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.200041 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-trusted-ca-bundle\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:49:41.200266 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.200163 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9c311900-9223-4d5c-8e00-6b3ecf8be5c8" (UID: "9c311900-9223-4d5c-8e00-6b3ecf8be5c8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:41.200266 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.200182 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-service-ca" (OuterVolumeSpecName: "service-ca") pod "9c311900-9223-4d5c-8e00-6b3ecf8be5c8" (UID: "9c311900-9223-4d5c-8e00-6b3ecf8be5c8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:49:41.202086 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.202064 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-kube-api-access-2rn8w" (OuterVolumeSpecName: "kube-api-access-2rn8w") pod "9c311900-9223-4d5c-8e00-6b3ecf8be5c8" (UID: "9c311900-9223-4d5c-8e00-6b3ecf8be5c8"). InnerVolumeSpecName "kube-api-access-2rn8w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:49:41.202165 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.202113 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9c311900-9223-4d5c-8e00-6b3ecf8be5c8" (UID: "9c311900-9223-4d5c-8e00-6b3ecf8be5c8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:41.202202 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.202165 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9c311900-9223-4d5c-8e00-6b3ecf8be5c8" (UID: "9c311900-9223-4d5c-8e00-6b3ecf8be5c8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:49:41.300858 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.300821 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-oauth-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:49:41.300858 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.300854 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:49:41.300858 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.300863 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-service-ca\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:49:41.301119 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.300872 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-oauth-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:49:41.301119 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.300881 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2rn8w\" (UniqueName: \"kubernetes.io/projected/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-kube-api-access-2rn8w\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:49:41.301119 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.300892 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c311900-9223-4d5c-8e00-6b3ecf8be5c8-console-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:49:41.881572 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.881546 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d45545b58-nff4w_9c311900-9223-4d5c-8e00-6b3ecf8be5c8/console/0.log"
Mar 18 16:49:41.881956 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.881588 2578 generic.go:358] "Generic (PLEG): container finished" podID="9c311900-9223-4d5c-8e00-6b3ecf8be5c8" containerID="6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba" exitCode=2
Mar 18 16:49:41.881956 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.881635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d45545b58-nff4w" event={"ID":"9c311900-9223-4d5c-8e00-6b3ecf8be5c8","Type":"ContainerDied","Data":"6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba"}
Mar 18 16:49:41.881956 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.881661 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d45545b58-nff4w"
Mar 18 16:49:41.881956 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.881673 2578 scope.go:117] "RemoveContainer" containerID="6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba"
Mar 18 16:49:41.881956 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.881662 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d45545b58-nff4w" event={"ID":"9c311900-9223-4d5c-8e00-6b3ecf8be5c8","Type":"ContainerDied","Data":"a075c3419ee5253ced68d28fbd79f9c7e3ce4fa77954dd7fec67e6e01cbc49b9"}
Mar 18 16:49:41.890127 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.890086 2578 scope.go:117] "RemoveContainer" containerID="6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba"
Mar 18 16:49:41.890424 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:49:41.890406 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba\": container with ID starting with 6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba not found: ID does not exist" containerID="6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba"
Mar 18 16:49:41.890473 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.890434 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba"} err="failed to get container status \"6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba\": rpc error: code = NotFound desc = could not find container \"6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba\": container with ID starting with 6dbf011f4319034f721a54a8f7ea6678b45d42bfd39d068d2cd102a829e91eba not found: ID does not exist"
Mar 18 16:49:41.899375 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.899325 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d45545b58-nff4w"]
Mar 18 16:49:41.901690 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.901658 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d45545b58-nff4w"]
Mar 18 16:49:41.996978 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:49:41.996944 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c311900-9223-4d5c-8e00-6b3ecf8be5c8" path="/var/lib/kubelet/pods/9c311900-9223-4d5c-8e00-6b3ecf8be5c8/volumes"
Mar 18 16:50:14.041824 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.041787 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78848b98d5-2zhnt"]
Mar 18 16:50:14.042304 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.042043 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c311900-9223-4d5c-8e00-6b3ecf8be5c8" containerName="console"
Mar 18 16:50:14.042304 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.042054 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c311900-9223-4d5c-8e00-6b3ecf8be5c8" containerName="console"
Mar 18 16:50:14.042304 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.042131 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c311900-9223-4d5c-8e00-6b3ecf8be5c8" containerName="console"
Mar 18 16:50:14.044811 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.044792 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:50:14.053811 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.053789 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78848b98d5-2zhnt"]
Mar 18 16:50:14.135352 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.135316 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-service-ca\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:50:14.135527 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.135352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-trusted-ca-bundle\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:50:14.135527 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.135407 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-oauth-config\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:50:14.135527 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.135430 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-oauth-serving-cert\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:50:14.135527 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.135456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-config\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:50:14.135527 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.135488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnfk4\" (UniqueName: \"kubernetes.io/projected/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-kube-api-access-vnfk4\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:50:14.135527 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.135505 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-serving-cert\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:50:14.236813 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.236776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-service-ca\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:50:14.236813 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.236816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-trusted-ca-bundle\") pod
\"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.237072 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.236836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-oauth-config\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.237072 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.236859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-oauth-serving-cert\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.237072 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.236996 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-config\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.237072 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.237055 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnfk4\" (UniqueName: \"kubernetes.io/projected/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-kube-api-access-vnfk4\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.237312 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.237082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-serving-cert\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.237764 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.237735 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-service-ca\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.237881 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.237772 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-oauth-serving-cert\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.237881 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.237803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-config\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.237881 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.237813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-trusted-ca-bundle\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.239957 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.239929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-oauth-config\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.240044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.240009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-serving-cert\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.244980 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.244957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnfk4\" (UniqueName: \"kubernetes.io/projected/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-kube-api-access-vnfk4\") pod \"console-78848b98d5-2zhnt\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") " pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.354087 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.354010 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:14.474626 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.474595 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78848b98d5-2zhnt"] Mar 18 16:50:14.477509 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:50:14.477472 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9b688fb_84bc_47a9_90e7_c46bad6b1df4.slice/crio-4eb10dba660b06af11b0bef67dfe6e9ff8b47c366d9f2e5762f4921aaa3658d1 WatchSource:0}: Error finding container 4eb10dba660b06af11b0bef67dfe6e9ff8b47c366d9f2e5762f4921aaa3658d1: Status 404 returned error can't find the container with id 4eb10dba660b06af11b0bef67dfe6e9ff8b47c366d9f2e5762f4921aaa3658d1 Mar 18 16:50:14.479205 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.479189 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:50:14.973028 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.972983 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78848b98d5-2zhnt" event={"ID":"b9b688fb-84bc-47a9-90e7-c46bad6b1df4","Type":"ContainerStarted","Data":"8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053"} Mar 18 16:50:14.973028 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.973032 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78848b98d5-2zhnt" event={"ID":"b9b688fb-84bc-47a9-90e7-c46bad6b1df4","Type":"ContainerStarted","Data":"4eb10dba660b06af11b0bef67dfe6e9ff8b47c366d9f2e5762f4921aaa3658d1"} Mar 18 16:50:14.991762 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:14.991707 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78848b98d5-2zhnt" podStartSLOduration=0.99169091 podStartE2EDuration="991.69091ms" podCreationTimestamp="2026-03-18 16:50:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:50:14.990220514 +0000 UTC m=+337.574994553" watchObservedRunningTime="2026-03-18 16:50:14.99169091 +0000 UTC m=+337.576464937" Mar 18 16:50:24.354481 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:24.354442 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:24.354481 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:24.354486 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:24.359213 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:24.359185 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:25.002523 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:25.002497 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78848b98d5-2zhnt" Mar 18 16:50:25.047523 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:25.047495 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d495f8cc9-mdmhb"] Mar 18 16:50:32.738161 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.738127 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-q8vrp"] Mar 18 16:50:32.741278 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.741256 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:32.743331 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.743314 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Mar 18 16:50:32.748885 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.748865 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q8vrp"] Mar 18 16:50:32.763884 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.763860 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48b70cf4-1041-40e2-aac2-2997c4e8585d-original-pull-secret\") pod \"global-pull-secret-syncer-q8vrp\" (UID: \"48b70cf4-1041-40e2-aac2-2997c4e8585d\") " pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:32.763974 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.763903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/48b70cf4-1041-40e2-aac2-2997c4e8585d-kubelet-config\") pod \"global-pull-secret-syncer-q8vrp\" (UID: \"48b70cf4-1041-40e2-aac2-2997c4e8585d\") " pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:32.763974 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.763926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/48b70cf4-1041-40e2-aac2-2997c4e8585d-dbus\") pod \"global-pull-secret-syncer-q8vrp\" (UID: \"48b70cf4-1041-40e2-aac2-2997c4e8585d\") " pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:32.865024 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.864996 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/48b70cf4-1041-40e2-aac2-2997c4e8585d-original-pull-secret\") pod \"global-pull-secret-syncer-q8vrp\" (UID: \"48b70cf4-1041-40e2-aac2-2997c4e8585d\") " pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:32.865156 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.865040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/48b70cf4-1041-40e2-aac2-2997c4e8585d-kubelet-config\") pod \"global-pull-secret-syncer-q8vrp\" (UID: \"48b70cf4-1041-40e2-aac2-2997c4e8585d\") " pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:32.865156 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.865077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/48b70cf4-1041-40e2-aac2-2997c4e8585d-dbus\") pod \"global-pull-secret-syncer-q8vrp\" (UID: \"48b70cf4-1041-40e2-aac2-2997c4e8585d\") " pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:32.865249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.865157 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/48b70cf4-1041-40e2-aac2-2997c4e8585d-kubelet-config\") pod \"global-pull-secret-syncer-q8vrp\" (UID: \"48b70cf4-1041-40e2-aac2-2997c4e8585d\") " pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:32.865249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.865224 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/48b70cf4-1041-40e2-aac2-2997c4e8585d-dbus\") pod \"global-pull-secret-syncer-q8vrp\" (UID: \"48b70cf4-1041-40e2-aac2-2997c4e8585d\") " pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:32.867386 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:32.867370 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/48b70cf4-1041-40e2-aac2-2997c4e8585d-original-pull-secret\") pod \"global-pull-secret-syncer-q8vrp\" (UID: \"48b70cf4-1041-40e2-aac2-2997c4e8585d\") " pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:33.051754 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:33.051673 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-q8vrp" Mar 18 16:50:33.165317 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:33.165285 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-q8vrp"] Mar 18 16:50:33.169162 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:50:33.169132 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48b70cf4_1041_40e2_aac2_2997c4e8585d.slice/crio-f55c0eeacb5d3439465206f6592055d32b58063a3eb7c80608dbf743817f4f3f WatchSource:0}: Error finding container f55c0eeacb5d3439465206f6592055d32b58063a3eb7c80608dbf743817f4f3f: Status 404 returned error can't find the container with id f55c0eeacb5d3439465206f6592055d32b58063a3eb7c80608dbf743817f4f3f Mar 18 16:50:34.023211 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:34.023163 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q8vrp" event={"ID":"48b70cf4-1041-40e2-aac2-2997c4e8585d","Type":"ContainerStarted","Data":"f55c0eeacb5d3439465206f6592055d32b58063a3eb7c80608dbf743817f4f3f"} Mar 18 16:50:38.035071 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:38.035031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-q8vrp" event={"ID":"48b70cf4-1041-40e2-aac2-2997c4e8585d","Type":"ContainerStarted","Data":"03c591faed0e537b278838300463d25d1207b10a90fb950fa388b0d0f783a3e3"} Mar 18 16:50:38.049862 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:38.049809 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-q8vrp" podStartSLOduration=1.8619120150000001 podStartE2EDuration="6.049793201s" podCreationTimestamp="2026-03-18 16:50:32 +0000 UTC" firstStartedPulling="2026-03-18 16:50:33.170731154 +0000 UTC m=+355.755505158" lastFinishedPulling="2026-03-18 16:50:37.358612338 +0000 UTC m=+359.943386344" observedRunningTime="2026-03-18 16:50:38.048448722 +0000 UTC m=+360.633222748" watchObservedRunningTime="2026-03-18 16:50:38.049793201 +0000 UTC m=+360.634567243" Mar 18 16:50:50.067570 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.067530 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d495f8cc9-mdmhb" podUID="97c5fe9c-9531-416c-9e82-bd4652596c45" containerName="console" containerID="cri-o://f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7" gracePeriod=15 Mar 18 16:50:50.308122 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.308082 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d495f8cc9-mdmhb_97c5fe9c-9531-416c-9e82-bd4652596c45/console/0.log" Mar 18 16:50:50.308238 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.308166 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d495f8cc9-mdmhb" Mar 18 16:50:50.393653 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.393624 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-oauth-config\") pod \"97c5fe9c-9531-416c-9e82-bd4652596c45\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " Mar 18 16:50:50.393817 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.393687 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-oauth-serving-cert\") pod \"97c5fe9c-9531-416c-9e82-bd4652596c45\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " Mar 18 16:50:50.393817 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.393709 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-service-ca\") pod \"97c5fe9c-9531-416c-9e82-bd4652596c45\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " Mar 18 16:50:50.393817 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.393735 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcdn2\" (UniqueName: \"kubernetes.io/projected/97c5fe9c-9531-416c-9e82-bd4652596c45-kube-api-access-tcdn2\") pod \"97c5fe9c-9531-416c-9e82-bd4652596c45\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " Mar 18 16:50:50.393817 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.393762 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-serving-cert\") pod \"97c5fe9c-9531-416c-9e82-bd4652596c45\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " Mar 18 16:50:50.393817 
ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.393808 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-console-config\") pod \"97c5fe9c-9531-416c-9e82-bd4652596c45\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " Mar 18 16:50:50.394064 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.393827 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-trusted-ca-bundle\") pod \"97c5fe9c-9531-416c-9e82-bd4652596c45\" (UID: \"97c5fe9c-9531-416c-9e82-bd4652596c45\") " Mar 18 16:50:50.394251 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.394229 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "97c5fe9c-9531-416c-9e82-bd4652596c45" (UID: "97c5fe9c-9531-416c-9e82-bd4652596c45"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:50:50.394251 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.394242 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-service-ca" (OuterVolumeSpecName: "service-ca") pod "97c5fe9c-9531-416c-9e82-bd4652596c45" (UID: "97c5fe9c-9531-416c-9e82-bd4652596c45"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:50:50.394373 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.394220 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-console-config" (OuterVolumeSpecName: "console-config") pod "97c5fe9c-9531-416c-9e82-bd4652596c45" (UID: "97c5fe9c-9531-416c-9e82-bd4652596c45"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:50:50.394373 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.394355 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "97c5fe9c-9531-416c-9e82-bd4652596c45" (UID: "97c5fe9c-9531-416c-9e82-bd4652596c45"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:50:50.396216 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.396192 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "97c5fe9c-9531-416c-9e82-bd4652596c45" (UID: "97c5fe9c-9531-416c-9e82-bd4652596c45"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:50:50.396344 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.396193 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c5fe9c-9531-416c-9e82-bd4652596c45-kube-api-access-tcdn2" (OuterVolumeSpecName: "kube-api-access-tcdn2") pod "97c5fe9c-9531-416c-9e82-bd4652596c45" (UID: "97c5fe9c-9531-416c-9e82-bd4652596c45"). InnerVolumeSpecName "kube-api-access-tcdn2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:50:50.396344 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.396264 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "97c5fe9c-9531-416c-9e82-bd4652596c45" (UID: "97c5fe9c-9531-416c-9e82-bd4652596c45"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:50:50.495205 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.495175 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-console-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:50:50.495205 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.495202 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-trusted-ca-bundle\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:50:50.495205 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.495215 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-oauth-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:50:50.495413 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.495224 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-oauth-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:50:50.495413 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.495233 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/97c5fe9c-9531-416c-9e82-bd4652596c45-service-ca\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:50:50.495413 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.495241 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tcdn2\" (UniqueName: \"kubernetes.io/projected/97c5fe9c-9531-416c-9e82-bd4652596c45-kube-api-access-tcdn2\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:50:50.495413 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:50.495250 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/97c5fe9c-9531-416c-9e82-bd4652596c45-console-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:50:51.069763 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.069738 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d495f8cc9-mdmhb_97c5fe9c-9531-416c-9e82-bd4652596c45/console/0.log" Mar 18 16:50:51.070236 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.069775 2578 generic.go:358] "Generic (PLEG): container finished" podID="97c5fe9c-9531-416c-9e82-bd4652596c45" containerID="f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7" exitCode=2 Mar 18 16:50:51.070236 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.069810 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d495f8cc9-mdmhb" event={"ID":"97c5fe9c-9531-416c-9e82-bd4652596c45","Type":"ContainerDied","Data":"f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7"} Mar 18 16:50:51.070236 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.069830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d495f8cc9-mdmhb" event={"ID":"97c5fe9c-9531-416c-9e82-bd4652596c45","Type":"ContainerDied","Data":"8cea9e4d7cb727df7d72c2f9129f15321eded3d18ebb744868176ef1ab933903"} Mar 18 16:50:51.070236 
ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.069844 2578 scope.go:117] "RemoveContainer" containerID="f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7" Mar 18 16:50:51.070236 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.069848 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d495f8cc9-mdmhb" Mar 18 16:50:51.078295 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.078275 2578 scope.go:117] "RemoveContainer" containerID="f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7" Mar 18 16:50:51.078516 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:50:51.078500 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7\": container with ID starting with f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7 not found: ID does not exist" containerID="f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7" Mar 18 16:50:51.078559 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.078523 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7"} err="failed to get container status \"f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7\": rpc error: code = NotFound desc = could not find container \"f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7\": container with ID starting with f1fdfe10929036d3867bde031254ef15f6c64c0832f9b063939bee2711cd59f7 not found: ID does not exist" Mar 18 16:50:51.090152 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.090132 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d495f8cc9-mdmhb"] Mar 18 16:50:51.094205 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.094182 2578 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-d495f8cc9-mdmhb"] Mar 18 16:50:51.997115 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:50:51.997070 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c5fe9c-9531-416c-9e82-bd4652596c45" path="/var/lib/kubelet/pods/97c5fe9c-9531-416c-9e82-bd4652596c45/volumes" Mar 18 16:51:12.132289 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.132250 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk"] Mar 18 16:51:12.132732 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.132536 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97c5fe9c-9531-416c-9e82-bd4652596c45" containerName="console" Mar 18 16:51:12.132732 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.132548 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c5fe9c-9531-416c-9e82-bd4652596c45" containerName="console" Mar 18 16:51:12.132732 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.132602 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="97c5fe9c-9531-416c-9e82-bd4652596c45" containerName="console" Mar 18 16:51:12.134444 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.134425 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.136343 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.136291 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Mar 18 16:51:12.136606 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.136588 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Mar 18 16:51:12.136936 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.136920 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Mar 18 16:51:12.136999 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.136933 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Mar 18 16:51:12.136999 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.136941 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Mar 18 16:51:12.137240 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.137225 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Mar 18 16:51:12.137344 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.137328 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Mar 18 16:51:12.146494 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.146471 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk"] Mar 18 16:51:12.259270 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.259238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-ca\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.259270 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.259271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm786\" (UniqueName: \"kubernetes.io/projected/9ec8ac63-392c-4931-87f7-6f1a88ff800d-kube-api-access-mm786\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.259440 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.259300 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.259440 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.259391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.259507 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.259443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9ec8ac63-392c-4931-87f7-6f1a88ff800d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.259507 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.259473 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-hub\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.359993 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.359951 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.360172 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.360003 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.360172 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:51:12.360035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9ec8ac63-392c-4931-87f7-6f1a88ff800d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.360172 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.360065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-hub\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.360172 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.360122 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-ca\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.360172 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.360158 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mm786\" (UniqueName: \"kubernetes.io/projected/9ec8ac63-392c-4931-87f7-6f1a88ff800d-kube-api-access-mm786\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.360987 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.360870 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/9ec8ac63-392c-4931-87f7-6f1a88ff800d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.362707 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.362682 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-ca\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.362809 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.362794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.362856 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.362827 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-hub\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.363031 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.363012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9ec8ac63-392c-4931-87f7-6f1a88ff800d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.367640 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.367620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm786\" (UniqueName: \"kubernetes.io/projected/9ec8ac63-392c-4931-87f7-6f1a88ff800d-kube-api-access-mm786\") pod \"cluster-proxy-proxy-agent-5d7c846b96-9qblk\" (UID: \"9ec8ac63-392c-4931-87f7-6f1a88ff800d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.456374 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.456340 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" Mar 18 16:51:12.576682 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:12.576638 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk"] Mar 18 16:51:12.580331 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:51:12.580303 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ec8ac63_392c_4931_87f7_6f1a88ff800d.slice/crio-8be029dd34b6ef6ef461835ef5eb7686b072933149603664232b3187951ccfe0 WatchSource:0}: Error finding container 8be029dd34b6ef6ef461835ef5eb7686b072933149603664232b3187951ccfe0: Status 404 returned error can't find the container with id 8be029dd34b6ef6ef461835ef5eb7686b072933149603664232b3187951ccfe0 Mar 18 16:51:13.131703 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:13.131667 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" event={"ID":"9ec8ac63-392c-4931-87f7-6f1a88ff800d","Type":"ContainerStarted","Data":"8be029dd34b6ef6ef461835ef5eb7686b072933149603664232b3187951ccfe0"} Mar 18 16:51:16.142272 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:51:16.142241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" event={"ID":"9ec8ac63-392c-4931-87f7-6f1a88ff800d","Type":"ContainerStarted","Data":"8f2e22bc0c60459a66ce9908852228009e8eaf9f941cbfe7b20e532940d85a64"} Mar 18 16:51:18.149827 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:18.149778 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" event={"ID":"9ec8ac63-392c-4931-87f7-6f1a88ff800d","Type":"ContainerStarted","Data":"67b5d92f7c83f115e4c09035bd9d7988d57e7d1a2fa4d31e4df88fb65b7977c1"} Mar 18 16:51:18.149827 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:18.149820 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" event={"ID":"9ec8ac63-392c-4931-87f7-6f1a88ff800d","Type":"ContainerStarted","Data":"1368a0701ac7864a7ccc29fdbf50b497eed2c546fcb30e23df4860cc4f95ecf8"} Mar 18 16:51:18.168443 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:18.168381 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5d7c846b96-9qblk" podStartSLOduration=1.207253987 podStartE2EDuration="6.168368455s" podCreationTimestamp="2026-03-18 16:51:12 +0000 UTC" firstStartedPulling="2026-03-18 16:51:12.582011536 +0000 UTC m=+395.166785543" lastFinishedPulling="2026-03-18 16:51:17.543125995 +0000 UTC m=+400.127900011" observedRunningTime="2026-03-18 16:51:18.167263786 +0000 UTC m=+400.752037810" watchObservedRunningTime="2026-03-18 16:51:18.168368455 +0000 UTC m=+400.753142479" Mar 18 16:51:51.771729 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.771688 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-7cdhx"] Mar 18 16:51:51.778767 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:51:51.778745 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 16:51:51.780818 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.780793 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Mar 18 16:51:51.780932 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.780884 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-97v2s\"" Mar 18 16:51:51.780932 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.780889 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Mar 18 16:51:51.781373 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.781350 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Mar 18 16:51:51.781373 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.781370 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Mar 18 16:51:51.785050 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.785030 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7cdhx"] Mar 18 16:51:51.852315 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.852292 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xwth\" (UniqueName: \"kubernetes.io/projected/17f44b5c-cc93-449a-8d75-47c8e63ccdcf-kube-api-access-7xwth\") pod \"keda-admission-cf49989db-7cdhx\" (UID: \"17f44b5c-cc93-449a-8d75-47c8e63ccdcf\") " pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 16:51:51.852452 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.852336 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17f44b5c-cc93-449a-8d75-47c8e63ccdcf-certificates\") pod \"keda-admission-cf49989db-7cdhx\" (UID: \"17f44b5c-cc93-449a-8d75-47c8e63ccdcf\") " pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 16:51:51.953193 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.953161 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xwth\" (UniqueName: \"kubernetes.io/projected/17f44b5c-cc93-449a-8d75-47c8e63ccdcf-kube-api-access-7xwth\") pod \"keda-admission-cf49989db-7cdhx\" (UID: \"17f44b5c-cc93-449a-8d75-47c8e63ccdcf\") " pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 16:51:51.953342 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.953206 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17f44b5c-cc93-449a-8d75-47c8e63ccdcf-certificates\") pod \"keda-admission-cf49989db-7cdhx\" (UID: \"17f44b5c-cc93-449a-8d75-47c8e63ccdcf\") " pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 16:51:51.955671 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.955649 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17f44b5c-cc93-449a-8d75-47c8e63ccdcf-certificates\") pod \"keda-admission-cf49989db-7cdhx\" (UID: \"17f44b5c-cc93-449a-8d75-47c8e63ccdcf\") " pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 16:51:51.959973 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:51.959953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xwth\" (UniqueName: \"kubernetes.io/projected/17f44b5c-cc93-449a-8d75-47c8e63ccdcf-kube-api-access-7xwth\") pod \"keda-admission-cf49989db-7cdhx\" (UID: \"17f44b5c-cc93-449a-8d75-47c8e63ccdcf\") " pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 
16:51:52.090442 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:52.090369 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 16:51:52.205648 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:52.205618 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7cdhx"] Mar 18 16:51:52.208430 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:51:52.208396 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f44b5c_cc93_449a_8d75_47c8e63ccdcf.slice/crio-5158fd1e271ca48e76045281e79c7c56d7157fdcea6b6d3df2f0fb686e6ff700 WatchSource:0}: Error finding container 5158fd1e271ca48e76045281e79c7c56d7157fdcea6b6d3df2f0fb686e6ff700: Status 404 returned error can't find the container with id 5158fd1e271ca48e76045281e79c7c56d7157fdcea6b6d3df2f0fb686e6ff700 Mar 18 16:51:52.241836 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:51:52.241808 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7cdhx" event={"ID":"17f44b5c-cc93-449a-8d75-47c8e63ccdcf","Type":"ContainerStarted","Data":"5158fd1e271ca48e76045281e79c7c56d7157fdcea6b6d3df2f0fb686e6ff700"} Mar 18 16:52:00.268038 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:52:00.267994 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7cdhx" event={"ID":"17f44b5c-cc93-449a-8d75-47c8e63ccdcf","Type":"ContainerStarted","Data":"6e747070935535980a3a4c973d4e9a9bc8c1c78e8bc458b29706b91333cdda82"} Mar 18 16:52:00.268474 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:52:00.268133 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 16:52:00.284667 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:52:00.284622 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-keda/keda-admission-cf49989db-7cdhx" podStartSLOduration=1.606620571 podStartE2EDuration="9.284604489s" podCreationTimestamp="2026-03-18 16:51:51 +0000 UTC" firstStartedPulling="2026-03-18 16:51:52.209596197 +0000 UTC m=+434.794370200" lastFinishedPulling="2026-03-18 16:51:59.8875801 +0000 UTC m=+442.472354118" observedRunningTime="2026-03-18 16:52:00.283332831 +0000 UTC m=+442.868106866" watchObservedRunningTime="2026-03-18 16:52:00.284604489 +0000 UTC m=+442.869378514" Mar 18 16:52:21.273110 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:52:21.273073 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-7cdhx" Mar 18 16:53:04.632325 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.632289 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-brrs5"] Mar 18 16:53:04.635783 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.635760 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:04.637658 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.637639 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Mar 18 16:53:04.637988 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.637969 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Mar 18 16:53:04.638067 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.637976 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-2l29j\"" Mar 18 16:53:04.644378 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.644354 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-brrs5"] Mar 18 16:53:04.793176 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.793137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l87w\" (UniqueName: \"kubernetes.io/projected/1a085bb5-5a70-408e-88d4-0ed55e7ef188-kube-api-access-9l87w\") pod \"cert-manager-webhook-597b96b99b-brrs5\" (UID: \"1a085bb5-5a70-408e-88d4-0ed55e7ef188\") " pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:04.793347 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.793224 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a085bb5-5a70-408e-88d4-0ed55e7ef188-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-brrs5\" (UID: \"1a085bb5-5a70-408e-88d4-0ed55e7ef188\") " pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:04.894466 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.894434 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/1a085bb5-5a70-408e-88d4-0ed55e7ef188-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-brrs5\" (UID: \"1a085bb5-5a70-408e-88d4-0ed55e7ef188\") " pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:04.894611 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.894483 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l87w\" (UniqueName: \"kubernetes.io/projected/1a085bb5-5a70-408e-88d4-0ed55e7ef188-kube-api-access-9l87w\") pod \"cert-manager-webhook-597b96b99b-brrs5\" (UID: \"1a085bb5-5a70-408e-88d4-0ed55e7ef188\") " pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:04.902156 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.902125 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1a085bb5-5a70-408e-88d4-0ed55e7ef188-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-brrs5\" (UID: \"1a085bb5-5a70-408e-88d4-0ed55e7ef188\") " pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:04.902156 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.902147 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l87w\" (UniqueName: \"kubernetes.io/projected/1a085bb5-5a70-408e-88d4-0ed55e7ef188-kube-api-access-9l87w\") pod \"cert-manager-webhook-597b96b99b-brrs5\" (UID: \"1a085bb5-5a70-408e-88d4-0ed55e7ef188\") " pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:04.944613 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:04.944580 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:05.063719 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:05.063617 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-brrs5"] Mar 18 16:53:05.066627 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:53:05.066599 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a085bb5_5a70_408e_88d4_0ed55e7ef188.slice/crio-e4af645fc99e8af2d83f4167f2a0db5008bdf8fdbd8513ede655383e5f919bb4 WatchSource:0}: Error finding container e4af645fc99e8af2d83f4167f2a0db5008bdf8fdbd8513ede655383e5f919bb4: Status 404 returned error can't find the container with id e4af645fc99e8af2d83f4167f2a0db5008bdf8fdbd8513ede655383e5f919bb4 Mar 18 16:53:05.449679 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:05.449627 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" event={"ID":"1a085bb5-5a70-408e-88d4-0ed55e7ef188","Type":"ContainerStarted","Data":"e4af645fc99e8af2d83f4167f2a0db5008bdf8fdbd8513ede655383e5f919bb4"} Mar 18 16:53:08.461724 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:08.461634 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" event={"ID":"1a085bb5-5a70-408e-88d4-0ed55e7ef188","Type":"ContainerStarted","Data":"fffe80fc130c23d6c1789b1c7cf00cb213857509dc5111a7eef374ebebbb92ae"} Mar 18 16:53:08.462073 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:08.461768 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:08.480826 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:08.480777 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" podStartSLOduration=1.366422523 
podStartE2EDuration="4.480762562s" podCreationTimestamp="2026-03-18 16:53:04 +0000 UTC" firstStartedPulling="2026-03-18 16:53:05.06893015 +0000 UTC m=+507.653704155" lastFinishedPulling="2026-03-18 16:53:08.183270187 +0000 UTC m=+510.768044194" observedRunningTime="2026-03-18 16:53:08.480073862 +0000 UTC m=+511.064847890" watchObservedRunningTime="2026-03-18 16:53:08.480762562 +0000 UTC m=+511.065536589" Mar 18 16:53:14.466614 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:14.466582 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-brrs5" Mar 18 16:53:23.749171 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.749138 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-rnfh9"] Mar 18 16:53:23.751919 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.751903 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rnfh9" Mar 18 16:53:23.753743 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.753722 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-vj2c2\"" Mar 18 16:53:23.761563 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.761542 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rnfh9"] Mar 18 16:53:23.829085 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.829054 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck47r\" (UniqueName: \"kubernetes.io/projected/6fc4cf95-6465-4343-98ff-bba22006b480-kube-api-access-ck47r\") pod \"cert-manager-759f64656b-rnfh9\" (UID: \"6fc4cf95-6465-4343-98ff-bba22006b480\") " pod="cert-manager/cert-manager-759f64656b-rnfh9" Mar 18 16:53:23.829241 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.829117 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6fc4cf95-6465-4343-98ff-bba22006b480-bound-sa-token\") pod \"cert-manager-759f64656b-rnfh9\" (UID: \"6fc4cf95-6465-4343-98ff-bba22006b480\") " pod="cert-manager/cert-manager-759f64656b-rnfh9" Mar 18 16:53:23.930274 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.930243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck47r\" (UniqueName: \"kubernetes.io/projected/6fc4cf95-6465-4343-98ff-bba22006b480-kube-api-access-ck47r\") pod \"cert-manager-759f64656b-rnfh9\" (UID: \"6fc4cf95-6465-4343-98ff-bba22006b480\") " pod="cert-manager/cert-manager-759f64656b-rnfh9" Mar 18 16:53:23.930405 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.930284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6fc4cf95-6465-4343-98ff-bba22006b480-bound-sa-token\") pod \"cert-manager-759f64656b-rnfh9\" (UID: \"6fc4cf95-6465-4343-98ff-bba22006b480\") " pod="cert-manager/cert-manager-759f64656b-rnfh9" Mar 18 16:53:23.938275 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.938250 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6fc4cf95-6465-4343-98ff-bba22006b480-bound-sa-token\") pod \"cert-manager-759f64656b-rnfh9\" (UID: \"6fc4cf95-6465-4343-98ff-bba22006b480\") " pod="cert-manager/cert-manager-759f64656b-rnfh9" Mar 18 16:53:23.938443 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:23.938423 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck47r\" (UniqueName: \"kubernetes.io/projected/6fc4cf95-6465-4343-98ff-bba22006b480-kube-api-access-ck47r\") pod \"cert-manager-759f64656b-rnfh9\" (UID: \"6fc4cf95-6465-4343-98ff-bba22006b480\") " pod="cert-manager/cert-manager-759f64656b-rnfh9" Mar 18 
16:53:24.060904 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:24.060838 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-rnfh9" Mar 18 16:53:24.176521 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:24.176487 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-rnfh9"] Mar 18 16:53:24.179505 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:53:24.179471 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fc4cf95_6465_4343_98ff_bba22006b480.slice/crio-1863bd4ea0b58ee3eae94ed9d6d7ffe4599e6f7803906813f7463882440834ad WatchSource:0}: Error finding container 1863bd4ea0b58ee3eae94ed9d6d7ffe4599e6f7803906813f7463882440834ad: Status 404 returned error can't find the container with id 1863bd4ea0b58ee3eae94ed9d6d7ffe4599e6f7803906813f7463882440834ad Mar 18 16:53:24.508075 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:24.508037 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rnfh9" event={"ID":"6fc4cf95-6465-4343-98ff-bba22006b480","Type":"ContainerStarted","Data":"8a6f70e2238635a556e562760f401c2cad0046e13ad454a296d420be6a393a03"} Mar 18 16:53:24.508075 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:24.508078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-rnfh9" event={"ID":"6fc4cf95-6465-4343-98ff-bba22006b480","Type":"ContainerStarted","Data":"1863bd4ea0b58ee3eae94ed9d6d7ffe4599e6f7803906813f7463882440834ad"} Mar 18 16:53:24.523140 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:24.523063 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-rnfh9" podStartSLOduration=1.523050536 podStartE2EDuration="1.523050536s" podCreationTimestamp="2026-03-18 16:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:53:24.521525557 +0000 UTC m=+527.106299583" watchObservedRunningTime="2026-03-18 16:53:24.523050536 +0000 UTC m=+527.107824562" Mar 18 16:53:59.107828 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.107797 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj"] Mar 18 16:53:59.110862 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.110841 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.113148 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.113127 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Mar 18 16:53:59.113245 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.113151 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:53:59.113563 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.113549 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Mar 18 16:53:59.113613 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.113571 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5lx2r\"" Mar 18 16:53:59.113809 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.113795 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Mar 18 16:53:59.113945 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.113927 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Mar 18 16:53:59.119641 ip-10-0-130-255 
kubenswrapper[2578]: I0318 16:53:59.119621 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj"] Mar 18 16:53:59.210227 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.210194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56681957-3dd3-4e13-8afe-1db309b51d33-cert\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.210361 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.210251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/56681957-3dd3-4e13-8afe-1db309b51d33-metrics-cert\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.210361 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.210300 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/56681957-3dd3-4e13-8afe-1db309b51d33-manager-config\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.210361 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.210322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68sff\" (UniqueName: \"kubernetes.io/projected/56681957-3dd3-4e13-8afe-1db309b51d33-kube-api-access-68sff\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" 
Mar 18 16:53:59.310827 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.310796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/56681957-3dd3-4e13-8afe-1db309b51d33-metrics-cert\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.310933 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.310832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/56681957-3dd3-4e13-8afe-1db309b51d33-manager-config\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.310933 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.310848 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68sff\" (UniqueName: \"kubernetes.io/projected/56681957-3dd3-4e13-8afe-1db309b51d33-kube-api-access-68sff\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.310933 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.310889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56681957-3dd3-4e13-8afe-1db309b51d33-cert\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.311464 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.311444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/56681957-3dd3-4e13-8afe-1db309b51d33-manager-config\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.313621 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.313598 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/56681957-3dd3-4e13-8afe-1db309b51d33-metrics-cert\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.313692 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.313598 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56681957-3dd3-4e13-8afe-1db309b51d33-cert\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.318524 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.318500 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68sff\" (UniqueName: \"kubernetes.io/projected/56681957-3dd3-4e13-8afe-1db309b51d33-kube-api-access-68sff\") pod \"lws-controller-manager-84978b767b-zgqtj\" (UID: \"56681957-3dd3-4e13-8afe-1db309b51d33\") " pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.420921 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.420885 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:53:59.543718 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.543691 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj"] Mar 18 16:53:59.546833 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:53:59.546801 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56681957_3dd3_4e13_8afe_1db309b51d33.slice/crio-716ae3c4796c826709c42140bf41278859efa00fc8fd82eb68e2cad278481a58 WatchSource:0}: Error finding container 716ae3c4796c826709c42140bf41278859efa00fc8fd82eb68e2cad278481a58: Status 404 returned error can't find the container with id 716ae3c4796c826709c42140bf41278859efa00fc8fd82eb68e2cad278481a58 Mar 18 16:53:59.608557 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:53:59.608526 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" event={"ID":"56681957-3dd3-4e13-8afe-1db309b51d33","Type":"ContainerStarted","Data":"716ae3c4796c826709c42140bf41278859efa00fc8fd82eb68e2cad278481a58"} Mar 18 16:54:02.619229 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:02.619191 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" event={"ID":"56681957-3dd3-4e13-8afe-1db309b51d33","Type":"ContainerStarted","Data":"42b6a7818e9e79de5545d6ce93c46011746356614b502f2bff7031a0294df7a7"} Mar 18 16:54:02.619717 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:02.619367 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:54:02.636262 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:02.636215 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" podStartSLOduration=1.141355588 podStartE2EDuration="3.63620054s" podCreationTimestamp="2026-03-18 16:53:59 +0000 UTC" firstStartedPulling="2026-03-18 16:53:59.54852239 +0000 UTC m=+562.133296394" lastFinishedPulling="2026-03-18 16:54:02.043367339 +0000 UTC m=+564.628141346" observedRunningTime="2026-03-18 16:54:02.635449919 +0000 UTC m=+565.220223946" watchObservedRunningTime="2026-03-18 16:54:02.63620054 +0000 UTC m=+565.220974565" Mar 18 16:54:09.173638 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.173599 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8bbb5d4bc-s8bsb"] Mar 18 16:54:09.176958 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.176942 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.186302 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.186280 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8bbb5d4bc-s8bsb"] Mar 18 16:54:09.291979 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.291946 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-oauth-serving-cert\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.292164 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.291985 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-console-config\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.292164 ip-10-0-130-255 kubenswrapper[2578]: 
I0318 16:54:09.292016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-trusted-ca-bundle\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.292164 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.292079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac9c969c-4d19-4316-8308-37d777523623-console-oauth-config\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.292164 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.292164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-service-ca\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.292330 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.292206 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md2w9\" (UniqueName: \"kubernetes.io/projected/ac9c969c-4d19-4316-8308-37d777523623-kube-api-access-md2w9\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.292330 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.292225 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac9c969c-4d19-4316-8308-37d777523623-console-serving-cert\") pod \"console-8bbb5d4bc-s8bsb\" 
(UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.393109 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-oauth-serving-cert\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.393109 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393113 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-console-config\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.393354 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-trusted-ca-bundle\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.393354 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac9c969c-4d19-4316-8308-37d777523623-console-oauth-config\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.393354 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-service-ca\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.393354 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-md2w9\" (UniqueName: \"kubernetes.io/projected/ac9c969c-4d19-4316-8308-37d777523623-kube-api-access-md2w9\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.393354 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac9c969c-4d19-4316-8308-37d777523623-console-serving-cert\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.393938 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-console-config\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.394038 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-service-ca\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.394038 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.393916 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-oauth-serving-cert\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.394347 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.394325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac9c969c-4d19-4316-8308-37d777523623-trusted-ca-bundle\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.395759 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.395734 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac9c969c-4d19-4316-8308-37d777523623-console-oauth-config\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.395962 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.395941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac9c969c-4d19-4316-8308-37d777523623-console-serving-cert\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.400563 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.400546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-md2w9\" (UniqueName: \"kubernetes.io/projected/ac9c969c-4d19-4316-8308-37d777523623-kube-api-access-md2w9\") pod \"console-8bbb5d4bc-s8bsb\" (UID: \"ac9c969c-4d19-4316-8308-37d777523623\") " pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.486595 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.486437 2578 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:09.608575 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.608538 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8bbb5d4bc-s8bsb"] Mar 18 16:54:09.612267 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:54:09.612238 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac9c969c_4d19_4316_8308_37d777523623.slice/crio-09e5f644de1af074abf15fbdf4f91bd08c742b85b448414a499719e09301efce WatchSource:0}: Error finding container 09e5f644de1af074abf15fbdf4f91bd08c742b85b448414a499719e09301efce: Status 404 returned error can't find the container with id 09e5f644de1af074abf15fbdf4f91bd08c742b85b448414a499719e09301efce Mar 18 16:54:09.638253 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:09.638228 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bbb5d4bc-s8bsb" event={"ID":"ac9c969c-4d19-4316-8308-37d777523623","Type":"ContainerStarted","Data":"09e5f644de1af074abf15fbdf4f91bd08c742b85b448414a499719e09301efce"} Mar 18 16:54:10.642014 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:10.641976 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8bbb5d4bc-s8bsb" event={"ID":"ac9c969c-4d19-4316-8308-37d777523623","Type":"ContainerStarted","Data":"550eaf38c1a9cd2ba13ebb22c5020c33cb0b21d7db8ab8cecb0172434d5ed70e"} Mar 18 16:54:10.661305 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:10.661253 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8bbb5d4bc-s8bsb" podStartSLOduration=1.6612379910000001 podStartE2EDuration="1.661237991s" podCreationTimestamp="2026-03-18 16:54:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:54:10.659519082 +0000 UTC 
m=+573.244293118" watchObservedRunningTime="2026-03-18 16:54:10.661237991 +0000 UTC m=+573.246012017" Mar 18 16:54:13.624023 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:13.623992 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-84978b767b-zgqtj" Mar 18 16:54:19.486899 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:19.486866 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:19.486899 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:19.486910 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:19.492034 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:19.492010 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:19.669672 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:19.669644 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8bbb5d4bc-s8bsb" Mar 18 16:54:19.722949 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:19.722912 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78848b98d5-2zhnt"] Mar 18 16:54:35.022297 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.022260 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-hmt7v"] Mar 18 16:54:35.025715 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.025689 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" Mar 18 16:54:35.027719 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.027700 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Mar 18 16:54:35.027839 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.027701 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Mar 18 16:54:35.028162 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.028147 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Mar 18 16:54:35.028274 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.028151 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-sd7tx\"" Mar 18 16:54:35.033871 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.033852 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-hmt7v"] Mar 18 16:54:35.049989 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.049966 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-hmt7v"] Mar 18 16:54:35.098058 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.098036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/39d41b2c-a876-4347-9f78-5b750630617b-config-file\") pod \"limitador-limitador-67566c68b4-hmt7v\" (UID: \"39d41b2c-a876-4347-9f78-5b750630617b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" Mar 18 16:54:35.098181 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.098064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdwst\" (UniqueName: 
\"kubernetes.io/projected/39d41b2c-a876-4347-9f78-5b750630617b-kube-api-access-fdwst\") pod \"limitador-limitador-67566c68b4-hmt7v\" (UID: \"39d41b2c-a876-4347-9f78-5b750630617b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" Mar 18 16:54:35.198889 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.198863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/39d41b2c-a876-4347-9f78-5b750630617b-config-file\") pod \"limitador-limitador-67566c68b4-hmt7v\" (UID: \"39d41b2c-a876-4347-9f78-5b750630617b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" Mar 18 16:54:35.199040 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.198894 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdwst\" (UniqueName: \"kubernetes.io/projected/39d41b2c-a876-4347-9f78-5b750630617b-kube-api-access-fdwst\") pod \"limitador-limitador-67566c68b4-hmt7v\" (UID: \"39d41b2c-a876-4347-9f78-5b750630617b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" Mar 18 16:54:35.199451 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.199432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/39d41b2c-a876-4347-9f78-5b750630617b-config-file\") pod \"limitador-limitador-67566c68b4-hmt7v\" (UID: \"39d41b2c-a876-4347-9f78-5b750630617b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" Mar 18 16:54:35.206254 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.206226 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdwst\" (UniqueName: \"kubernetes.io/projected/39d41b2c-a876-4347-9f78-5b750630617b-kube-api-access-fdwst\") pod \"limitador-limitador-67566c68b4-hmt7v\" (UID: \"39d41b2c-a876-4347-9f78-5b750630617b\") " pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" Mar 18 16:54:35.337234 
ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.337160 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" Mar 18 16:54:35.355541 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.355511 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-g4qrs"] Mar 18 16:54:35.361675 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.361649 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-g4qrs" Mar 18 16:54:35.364348 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.364326 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-mgks8\"" Mar 18 16:54:35.373063 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.373037 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g4qrs"] Mar 18 16:54:35.401288 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.401145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjc9q\" (UniqueName: \"kubernetes.io/projected/0a867b61-20f9-42ec-9035-e6e1e662a8ca-kube-api-access-fjc9q\") pod \"authorino-674b59b84c-g4qrs\" (UID: \"0a867b61-20f9-42ec-9035-e6e1e662a8ca\") " pod="kuadrant-system/authorino-674b59b84c-g4qrs" Mar 18 16:54:35.465361 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.465331 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-hmt7v"] Mar 18 16:54:35.468652 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:54:35.468625 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d41b2c_a876_4347_9f78_5b750630617b.slice/crio-ba5f15ae15386222af4355a2cfc1210a12b335e46af81fe461f44452b88d0ed3 WatchSource:0}: Error finding 
container ba5f15ae15386222af4355a2cfc1210a12b335e46af81fe461f44452b88d0ed3: Status 404 returned error can't find the container with id ba5f15ae15386222af4355a2cfc1210a12b335e46af81fe461f44452b88d0ed3 Mar 18 16:54:35.502457 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.502429 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjc9q\" (UniqueName: \"kubernetes.io/projected/0a867b61-20f9-42ec-9035-e6e1e662a8ca-kube-api-access-fjc9q\") pod \"authorino-674b59b84c-g4qrs\" (UID: \"0a867b61-20f9-42ec-9035-e6e1e662a8ca\") " pod="kuadrant-system/authorino-674b59b84c-g4qrs" Mar 18 16:54:35.509725 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.509705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjc9q\" (UniqueName: \"kubernetes.io/projected/0a867b61-20f9-42ec-9035-e6e1e662a8ca-kube-api-access-fjc9q\") pod \"authorino-674b59b84c-g4qrs\" (UID: \"0a867b61-20f9-42ec-9035-e6e1e662a8ca\") " pod="kuadrant-system/authorino-674b59b84c-g4qrs" Mar 18 16:54:35.674211 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.674180 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-g4qrs"
Mar 18 16:54:35.713710 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.713674 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" event={"ID":"39d41b2c-a876-4347-9f78-5b750630617b","Type":"ContainerStarted","Data":"ba5f15ae15386222af4355a2cfc1210a12b335e46af81fe461f44452b88d0ed3"}
Mar 18 16:54:35.799651 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:35.799614 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g4qrs"]
Mar 18 16:54:35.803009 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:54:35.802980 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a867b61_20f9_42ec_9035_e6e1e662a8ca.slice/crio-6414a2f4decf684d5ab2823d3ade1c9bd5b2bcc71034245b366c7c45251fa77e WatchSource:0}: Error finding container 6414a2f4decf684d5ab2823d3ade1c9bd5b2bcc71034245b366c7c45251fa77e: Status 404 returned error can't find the container with id 6414a2f4decf684d5ab2823d3ade1c9bd5b2bcc71034245b366c7c45251fa77e
Mar 18 16:54:36.718229 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:36.718186 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-g4qrs" event={"ID":"0a867b61-20f9-42ec-9035-e6e1e662a8ca","Type":"ContainerStarted","Data":"6414a2f4decf684d5ab2823d3ade1c9bd5b2bcc71034245b366c7c45251fa77e"}
Mar 18 16:54:39.885832 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:39.885790 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g4qrs"]
Mar 18 16:54:40.574982 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.574956 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-jh7zd_73e43277-038f-4657-95b6-addae5fb597c/cluster-monitoring-operator/0.log"
Mar 18 16:54:40.575282 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.575052 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-jh7zd_73e43277-038f-4657-95b6-addae5fb597c/cluster-monitoring-operator/0.log"
Mar 18 16:54:40.581791 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.581771 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/0.log"
Mar 18 16:54:40.581876 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.581802 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/0.log"
Mar 18 16:54:40.735000 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.734948 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" event={"ID":"39d41b2c-a876-4347-9f78-5b750630617b","Type":"ContainerStarted","Data":"bac97af1c234c9d873c79c7ede4a2348a110ee5174d9394917c90c009899b781"}
Mar 18 16:54:40.735212 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.735187 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v"
Mar 18 16:54:40.736562 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.736528 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-g4qrs" event={"ID":"0a867b61-20f9-42ec-9035-e6e1e662a8ca","Type":"ContainerStarted","Data":"41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d"}
Mar 18 16:54:40.736691 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.736629 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-g4qrs" podUID="0a867b61-20f9-42ec-9035-e6e1e662a8ca" containerName="authorino" containerID="cri-o://41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d" gracePeriod=30
Mar 18 16:54:40.753146 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.753086 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v" podStartSLOduration=0.594388685 podStartE2EDuration="5.753072946s" podCreationTimestamp="2026-03-18 16:54:35 +0000 UTC" firstStartedPulling="2026-03-18 16:54:35.470593415 +0000 UTC m=+598.055367419" lastFinishedPulling="2026-03-18 16:54:40.629277662 +0000 UTC m=+603.214051680" observedRunningTime="2026-03-18 16:54:40.750494833 +0000 UTC m=+603.335268878" watchObservedRunningTime="2026-03-18 16:54:40.753072946 +0000 UTC m=+603.337846971"
Mar 18 16:54:40.765491 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.765454 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-g4qrs" podStartSLOduration=0.99084552 podStartE2EDuration="5.765441425s" podCreationTimestamp="2026-03-18 16:54:35 +0000 UTC" firstStartedPulling="2026-03-18 16:54:35.804616204 +0000 UTC m=+598.389390222" lastFinishedPulling="2026-03-18 16:54:40.579212123 +0000 UTC m=+603.163986127" observedRunningTime="2026-03-18 16:54:40.764821115 +0000 UTC m=+603.349595155" watchObservedRunningTime="2026-03-18 16:54:40.765441425 +0000 UTC m=+603.350215464"
Mar 18 16:54:40.971451 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:40.971422 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-g4qrs"
Mar 18 16:54:41.052329 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.052296 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjc9q\" (UniqueName: \"kubernetes.io/projected/0a867b61-20f9-42ec-9035-e6e1e662a8ca-kube-api-access-fjc9q\") pod \"0a867b61-20f9-42ec-9035-e6e1e662a8ca\" (UID: \"0a867b61-20f9-42ec-9035-e6e1e662a8ca\") "
Mar 18 16:54:41.054571 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.054549 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a867b61-20f9-42ec-9035-e6e1e662a8ca-kube-api-access-fjc9q" (OuterVolumeSpecName: "kube-api-access-fjc9q") pod "0a867b61-20f9-42ec-9035-e6e1e662a8ca" (UID: "0a867b61-20f9-42ec-9035-e6e1e662a8ca"). InnerVolumeSpecName "kube-api-access-fjc9q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:54:41.153844 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.153812 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjc9q\" (UniqueName: \"kubernetes.io/projected/0a867b61-20f9-42ec-9035-e6e1e662a8ca-kube-api-access-fjc9q\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:54:41.740248 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.740207 2578 generic.go:358] "Generic (PLEG): container finished" podID="0a867b61-20f9-42ec-9035-e6e1e662a8ca" containerID="41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d" exitCode=0
Mar 18 16:54:41.740434 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.740256 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-g4qrs"
Mar 18 16:54:41.740434 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.740290 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-g4qrs" event={"ID":"0a867b61-20f9-42ec-9035-e6e1e662a8ca","Type":"ContainerDied","Data":"41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d"}
Mar 18 16:54:41.740434 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.740329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-g4qrs" event={"ID":"0a867b61-20f9-42ec-9035-e6e1e662a8ca","Type":"ContainerDied","Data":"6414a2f4decf684d5ab2823d3ade1c9bd5b2bcc71034245b366c7c45251fa77e"}
Mar 18 16:54:41.740434 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.740344 2578 scope.go:117] "RemoveContainer" containerID="41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d"
Mar 18 16:54:41.748790 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.748771 2578 scope.go:117] "RemoveContainer" containerID="41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d"
Mar 18 16:54:41.749045 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:54:41.749027 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d\": container with ID starting with 41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d not found: ID does not exist" containerID="41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d"
Mar 18 16:54:41.749108 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.749054 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d"} err="failed to get container status \"41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d\": rpc error: code = NotFound desc = could not find container \"41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d\": container with ID starting with 41d79d9780f707fd95e92def9ce7c23fc0e9f925f09a3fdab5bcb7fbbfd8ca3d not found: ID does not exist"
Mar 18 16:54:41.758315 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.758296 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g4qrs"]
Mar 18 16:54:41.761863 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.761838 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-g4qrs"]
Mar 18 16:54:41.997297 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:41.997265 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a867b61-20f9-42ec-9035-e6e1e662a8ca" path="/var/lib/kubelet/pods/0a867b61-20f9-42ec-9035-e6e1e662a8ca/volumes"
Mar 18 16:54:44.742452 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:44.742400 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78848b98d5-2zhnt" podUID="b9b688fb-84bc-47a9-90e7-c46bad6b1df4" containerName="console" containerID="cri-o://8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053" gracePeriod=15
Mar 18 16:54:44.984085 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:44.984065 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78848b98d5-2zhnt_b9b688fb-84bc-47a9-90e7-c46bad6b1df4/console/0.log"
Mar 18 16:54:44.984227 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:44.984150 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:54:45.087044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.086960 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnfk4\" (UniqueName: \"kubernetes.io/projected/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-kube-api-access-vnfk4\") pod \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") "
Mar 18 16:54:45.087044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.086998 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-service-ca\") pod \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") "
Mar 18 16:54:45.087044 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.087035 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-oauth-config\") pod \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") "
Mar 18 16:54:45.087333 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.087067 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-oauth-serving-cert\") pod \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") "
Mar 18 16:54:45.087333 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.087117 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-config\") pod \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") "
Mar 18 16:54:45.087333 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.087157 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-trusted-ca-bundle\") pod \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") "
Mar 18 16:54:45.087333 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.087181 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-serving-cert\") pod \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\" (UID: \"b9b688fb-84bc-47a9-90e7-c46bad6b1df4\") "
Mar 18 16:54:45.087531 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.087490 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b9b688fb-84bc-47a9-90e7-c46bad6b1df4" (UID: "b9b688fb-84bc-47a9-90e7-c46bad6b1df4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:54:45.087597 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.087534 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-config" (OuterVolumeSpecName: "console-config") pod "b9b688fb-84bc-47a9-90e7-c46bad6b1df4" (UID: "b9b688fb-84bc-47a9-90e7-c46bad6b1df4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:54:45.087597 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.087541 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-service-ca" (OuterVolumeSpecName: "service-ca") pod "b9b688fb-84bc-47a9-90e7-c46bad6b1df4" (UID: "b9b688fb-84bc-47a9-90e7-c46bad6b1df4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:54:45.087597 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.087551 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b9b688fb-84bc-47a9-90e7-c46bad6b1df4" (UID: "b9b688fb-84bc-47a9-90e7-c46bad6b1df4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:54:45.089471 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.089442 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b9b688fb-84bc-47a9-90e7-c46bad6b1df4" (UID: "b9b688fb-84bc-47a9-90e7-c46bad6b1df4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:54:45.089570 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.089495 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b9b688fb-84bc-47a9-90e7-c46bad6b1df4" (UID: "b9b688fb-84bc-47a9-90e7-c46bad6b1df4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:54:45.089570 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.089531 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-kube-api-access-vnfk4" (OuterVolumeSpecName: "kube-api-access-vnfk4") pod "b9b688fb-84bc-47a9-90e7-c46bad6b1df4" (UID: "b9b688fb-84bc-47a9-90e7-c46bad6b1df4"). InnerVolumeSpecName "kube-api-access-vnfk4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:54:45.188157 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.188122 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-oauth-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:54:45.188157 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.188156 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-oauth-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:54:45.188327 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.188167 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-config\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:54:45.188327 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.188177 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-trusted-ca-bundle\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:54:45.188327 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.188185 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-console-serving-cert\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:54:45.188327 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.188193 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vnfk4\" (UniqueName: \"kubernetes.io/projected/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-kube-api-access-vnfk4\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:54:45.188327 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.188202 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b9b688fb-84bc-47a9-90e7-c46bad6b1df4-service-ca\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:54:45.753839 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.753812 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78848b98d5-2zhnt_b9b688fb-84bc-47a9-90e7-c46bad6b1df4/console/0.log"
Mar 18 16:54:45.754253 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.753856 2578 generic.go:358] "Generic (PLEG): container finished" podID="b9b688fb-84bc-47a9-90e7-c46bad6b1df4" containerID="8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053" exitCode=2
Mar 18 16:54:45.754253 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.753927 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78848b98d5-2zhnt"
Mar 18 16:54:45.754253 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.753945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78848b98d5-2zhnt" event={"ID":"b9b688fb-84bc-47a9-90e7-c46bad6b1df4","Type":"ContainerDied","Data":"8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053"}
Mar 18 16:54:45.754253 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.753984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78848b98d5-2zhnt" event={"ID":"b9b688fb-84bc-47a9-90e7-c46bad6b1df4","Type":"ContainerDied","Data":"4eb10dba660b06af11b0bef67dfe6e9ff8b47c366d9f2e5762f4921aaa3658d1"}
Mar 18 16:54:45.754253 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.753998 2578 scope.go:117] "RemoveContainer" containerID="8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053"
Mar 18 16:54:45.763700 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.763676 2578 scope.go:117] "RemoveContainer" containerID="8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053"
Mar 18 16:54:45.763929 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:54:45.763909 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053\": container with ID starting with 8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053 not found: ID does not exist" containerID="8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053"
Mar 18 16:54:45.763975 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.763937 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053"} err="failed to get container status \"8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053\": rpc error: code = NotFound desc = could not find container \"8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053\": container with ID starting with 8c38d481f13cc53709b22b1812e7f7408d9614d3b2a47ac4c09cbe740308c053 not found: ID does not exist"
Mar 18 16:54:45.774466 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.774443 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78848b98d5-2zhnt"]
Mar 18 16:54:45.779504 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.779485 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78848b98d5-2zhnt"]
Mar 18 16:54:45.997185 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:45.997156 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b688fb-84bc-47a9-90e7-c46bad6b1df4" path="/var/lib/kubelet/pods/b9b688fb-84bc-47a9-90e7-c46bad6b1df4/volumes"
Mar 18 16:54:51.741998 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:54:51.741969 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-hmt7v"
Mar 18 16:56:10.864110 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.864064 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-54n22"]
Mar 18 16:56:10.864564 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.864414 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a867b61-20f9-42ec-9035-e6e1e662a8ca" containerName="authorino"
Mar 18 16:56:10.864564 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.864427 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a867b61-20f9-42ec-9035-e6e1e662a8ca" containerName="authorino"
Mar 18 16:56:10.864564 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.864436 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9b688fb-84bc-47a9-90e7-c46bad6b1df4" containerName="console"
Mar 18 16:56:10.864564 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.864441 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b688fb-84bc-47a9-90e7-c46bad6b1df4" containerName="console"
Mar 18 16:56:10.864564 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.864495 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9b688fb-84bc-47a9-90e7-c46bad6b1df4" containerName="console"
Mar 18 16:56:10.864564 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.864506 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a867b61-20f9-42ec-9035-e6e1e662a8ca" containerName="authorino"
Mar 18 16:56:10.867306 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.867290 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-54n22"
Mar 18 16:56:10.869004 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.868970 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Mar 18 16:56:10.869145 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.869067 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Mar 18 16:56:10.869483 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.869467 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-hzfrv\""
Mar 18 16:56:10.869543 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.869472 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Mar 18 16:56:10.873157 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.873136 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-54n22"]
Mar 18 16:56:10.946406 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:10.946375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgq67\" (UniqueName: \"kubernetes.io/projected/5c682e8b-b3e9-467e-87c7-bea53b47a5e6-kube-api-access-zgq67\") pod \"s3-init-54n22\" (UID: \"5c682e8b-b3e9-467e-87c7-bea53b47a5e6\") " pod="kserve/s3-init-54n22"
Mar 18 16:56:11.047684 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:11.047654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgq67\" (UniqueName: \"kubernetes.io/projected/5c682e8b-b3e9-467e-87c7-bea53b47a5e6-kube-api-access-zgq67\") pod \"s3-init-54n22\" (UID: \"5c682e8b-b3e9-467e-87c7-bea53b47a5e6\") " pod="kserve/s3-init-54n22"
Mar 18 16:56:11.055556 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:11.055534 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgq67\" (UniqueName: \"kubernetes.io/projected/5c682e8b-b3e9-467e-87c7-bea53b47a5e6-kube-api-access-zgq67\") pod \"s3-init-54n22\" (UID: \"5c682e8b-b3e9-467e-87c7-bea53b47a5e6\") " pod="kserve/s3-init-54n22"
Mar 18 16:56:11.176451 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:11.176419 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-54n22"
Mar 18 16:56:11.304338 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:11.304295 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-54n22"]
Mar 18 16:56:11.307251 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:56:11.307221 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c682e8b_b3e9_467e_87c7_bea53b47a5e6.slice/crio-2a7f7eeab07a8d5911ec016658182448a3411e2e4dbb67ad596db2989036c0fb WatchSource:0}: Error finding container 2a7f7eeab07a8d5911ec016658182448a3411e2e4dbb67ad596db2989036c0fb: Status 404 returned error can't find the container with id 2a7f7eeab07a8d5911ec016658182448a3411e2e4dbb67ad596db2989036c0fb
Mar 18 16:56:11.309038 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:11.309020 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 16:56:12.008879 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:12.008832 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-54n22" event={"ID":"5c682e8b-b3e9-467e-87c7-bea53b47a5e6","Type":"ContainerStarted","Data":"2a7f7eeab07a8d5911ec016658182448a3411e2e4dbb67ad596db2989036c0fb"}
Mar 18 16:56:16.023536 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:16.023447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-54n22" event={"ID":"5c682e8b-b3e9-467e-87c7-bea53b47a5e6","Type":"ContainerStarted","Data":"1d0a405cf98e012a185a119d30a75bbf8d124e39d7c1206405b5ff49081c8723"}
Mar 18 16:56:16.039144 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:16.039080 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-54n22" podStartSLOduration=1.670896167 podStartE2EDuration="6.039064055s" podCreationTimestamp="2026-03-18 16:56:10 +0000 UTC" firstStartedPulling="2026-03-18 16:56:11.309171321 +0000 UTC m=+693.893945324" lastFinishedPulling="2026-03-18 16:56:15.6773392 +0000 UTC m=+698.262113212" observedRunningTime="2026-03-18 16:56:16.037580061 +0000 UTC m=+698.622354088" watchObservedRunningTime="2026-03-18 16:56:16.039064055 +0000 UTC m=+698.623838082"
Mar 18 16:56:19.033659 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:19.033614 2578 generic.go:358] "Generic (PLEG): container finished" podID="5c682e8b-b3e9-467e-87c7-bea53b47a5e6" containerID="1d0a405cf98e012a185a119d30a75bbf8d124e39d7c1206405b5ff49081c8723" exitCode=0
Mar 18 16:56:19.034042 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:19.033690 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-54n22" event={"ID":"5c682e8b-b3e9-467e-87c7-bea53b47a5e6","Type":"ContainerDied","Data":"1d0a405cf98e012a185a119d30a75bbf8d124e39d7c1206405b5ff49081c8723"}
Mar 18 16:56:20.168513 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:20.168493 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-54n22"
Mar 18 16:56:20.228608 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:20.228578 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgq67\" (UniqueName: \"kubernetes.io/projected/5c682e8b-b3e9-467e-87c7-bea53b47a5e6-kube-api-access-zgq67\") pod \"5c682e8b-b3e9-467e-87c7-bea53b47a5e6\" (UID: \"5c682e8b-b3e9-467e-87c7-bea53b47a5e6\") "
Mar 18 16:56:20.230880 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:20.230849 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c682e8b-b3e9-467e-87c7-bea53b47a5e6-kube-api-access-zgq67" (OuterVolumeSpecName: "kube-api-access-zgq67") pod "5c682e8b-b3e9-467e-87c7-bea53b47a5e6" (UID: "5c682e8b-b3e9-467e-87c7-bea53b47a5e6"). InnerVolumeSpecName "kube-api-access-zgq67". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:56:20.330154 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:20.330049 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zgq67\" (UniqueName: \"kubernetes.io/projected/5c682e8b-b3e9-467e-87c7-bea53b47a5e6-kube-api-access-zgq67\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\""
Mar 18 16:56:21.041260 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:21.041224 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-54n22" event={"ID":"5c682e8b-b3e9-467e-87c7-bea53b47a5e6","Type":"ContainerDied","Data":"2a7f7eeab07a8d5911ec016658182448a3411e2e4dbb67ad596db2989036c0fb"}
Mar 18 16:56:21.041260 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:21.041246 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-54n22"
Mar 18 16:56:21.041260 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:56:21.041257 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a7f7eeab07a8d5911ec016658182448a3411e2e4dbb67ad596db2989036c0fb"
Mar 18 16:57:26.228264 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.228230 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-h44fs/must-gather-8wgd9"]
Mar 18 16:57:26.228655 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.228545 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c682e8b-b3e9-467e-87c7-bea53b47a5e6" containerName="s3-init"
Mar 18 16:57:26.228655 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.228556 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c682e8b-b3e9-467e-87c7-bea53b47a5e6" containerName="s3-init"
Mar 18 16:57:26.228655 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.228612 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c682e8b-b3e9-467e-87c7-bea53b47a5e6" containerName="s3-init"
Mar 18 16:57:26.231459 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.231441 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h44fs/must-gather-8wgd9"
Mar 18 16:57:26.233379 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.233360 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h44fs\"/\"openshift-service-ca.crt\""
Mar 18 16:57:26.233465 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.233358 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-h44fs\"/\"kube-root-ca.crt\""
Mar 18 16:57:26.233846 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.233832 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-h44fs\"/\"default-dockercfg-689xr\""
Mar 18 16:57:26.237705 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.237682 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h44fs/must-gather-8wgd9"]
Mar 18 16:57:26.339249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.339200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nl9c\" (UniqueName: \"kubernetes.io/projected/23707746-3844-4eea-a268-dda75637d2c7-kube-api-access-2nl9c\") pod \"must-gather-8wgd9\" (UID: \"23707746-3844-4eea-a268-dda75637d2c7\") " pod="openshift-must-gather-h44fs/must-gather-8wgd9"
Mar 18 16:57:26.339249 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.339257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23707746-3844-4eea-a268-dda75637d2c7-must-gather-output\") pod \"must-gather-8wgd9\" (UID: \"23707746-3844-4eea-a268-dda75637d2c7\") " pod="openshift-must-gather-h44fs/must-gather-8wgd9"
Mar 18 16:57:26.440263 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.440229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nl9c\" (UniqueName: \"kubernetes.io/projected/23707746-3844-4eea-a268-dda75637d2c7-kube-api-access-2nl9c\") pod \"must-gather-8wgd9\" (UID: \"23707746-3844-4eea-a268-dda75637d2c7\") " pod="openshift-must-gather-h44fs/must-gather-8wgd9"
Mar 18 16:57:26.440263 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.440268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23707746-3844-4eea-a268-dda75637d2c7-must-gather-output\") pod \"must-gather-8wgd9\" (UID: \"23707746-3844-4eea-a268-dda75637d2c7\") " pod="openshift-must-gather-h44fs/must-gather-8wgd9"
Mar 18 16:57:26.440548 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.440533 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23707746-3844-4eea-a268-dda75637d2c7-must-gather-output\") pod \"must-gather-8wgd9\" (UID: \"23707746-3844-4eea-a268-dda75637d2c7\") " pod="openshift-must-gather-h44fs/must-gather-8wgd9"
Mar 18 16:57:26.449626 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.449601 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nl9c\" (UniqueName: \"kubernetes.io/projected/23707746-3844-4eea-a268-dda75637d2c7-kube-api-access-2nl9c\") pod \"must-gather-8wgd9\" (UID: \"23707746-3844-4eea-a268-dda75637d2c7\") " pod="openshift-must-gather-h44fs/must-gather-8wgd9"
Mar 18 16:57:26.541370 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.541314 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h44fs/must-gather-8wgd9"
Mar 18 16:57:26.664876 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:26.664840 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-h44fs/must-gather-8wgd9"]
Mar 18 16:57:26.667869 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:57:26.667837 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23707746_3844_4eea_a268_dda75637d2c7.slice/crio-9559d00df137493440fb0ee9ad82dc8788fcaf308c018cca60c803ba2ac65b9b WatchSource:0}: Error finding container 9559d00df137493440fb0ee9ad82dc8788fcaf308c018cca60c803ba2ac65b9b: Status 404 returned error can't find the container with id 9559d00df137493440fb0ee9ad82dc8788fcaf308c018cca60c803ba2ac65b9b
Mar 18 16:57:27.227571 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:27.227515 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h44fs/must-gather-8wgd9" event={"ID":"23707746-3844-4eea-a268-dda75637d2c7","Type":"ContainerStarted","Data":"9559d00df137493440fb0ee9ad82dc8788fcaf308c018cca60c803ba2ac65b9b"}
Mar 18 16:57:31.243458 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:31.243414 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h44fs/must-gather-8wgd9" event={"ID":"23707746-3844-4eea-a268-dda75637d2c7","Type":"ContainerStarted","Data":"a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7"}
Mar 18 16:57:32.250406 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:32.250360 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h44fs/must-gather-8wgd9" event={"ID":"23707746-3844-4eea-a268-dda75637d2c7","Type":"ContainerStarted","Data":"a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02"}
Mar 18 16:57:32.267397 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:32.267337 2578 pod_startup_latency_tracker.go:104] "Observed pod startup
duration" pod="openshift-must-gather-h44fs/must-gather-8wgd9" podStartSLOduration=1.8362788110000001 podStartE2EDuration="6.267319489s" podCreationTimestamp="2026-03-18 16:57:26 +0000 UTC" firstStartedPulling="2026-03-18 16:57:26.669834491 +0000 UTC m=+769.254608494" lastFinishedPulling="2026-03-18 16:57:31.100875165 +0000 UTC m=+773.685649172" observedRunningTime="2026-03-18 16:57:32.264913056 +0000 UTC m=+774.849687084" watchObservedRunningTime="2026-03-18 16:57:32.267319489 +0000 UTC m=+774.852093515" Mar 18 16:57:50.667066 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:50.667037 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-hmt7v_39d41b2c-a876-4347-9f78-5b750630617b/limitador/0.log" Mar 18 16:57:51.440574 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:51.440537 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-hmt7v_39d41b2c-a876-4347-9f78-5b750630617b/limitador/0.log" Mar 18 16:57:52.194840 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:52.194814 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-hmt7v_39d41b2c-a876-4347-9f78-5b750630617b/limitador/0.log" Mar 18 16:57:52.915210 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:52.915181 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-hmt7v_39d41b2c-a876-4347-9f78-5b750630617b/limitador/0.log" Mar 18 16:57:53.654502 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:53.654473 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-hmt7v_39d41b2c-a876-4347-9f78-5b750630617b/limitador/0.log" Mar 18 16:57:54.317953 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:54.317922 2578 generic.go:358] "Generic (PLEG): container finished" podID="23707746-3844-4eea-a268-dda75637d2c7" 
containerID="a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7" exitCode=0 Mar 18 16:57:54.318166 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:54.317961 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-h44fs/must-gather-8wgd9" event={"ID":"23707746-3844-4eea-a268-dda75637d2c7","Type":"ContainerDied","Data":"a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7"} Mar 18 16:57:54.318324 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:54.318307 2578 scope.go:117] "RemoveContainer" containerID="a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7" Mar 18 16:57:55.179448 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.179407 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h44fs_must-gather-8wgd9_23707746-3844-4eea-a268-dda75637d2c7/gather/0.log" Mar 18 16:57:55.687676 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.687644 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zsmk9/must-gather-b9tr7"] Mar 18 16:57:55.692118 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.692064 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zsmk9/must-gather-b9tr7" Mar 18 16:57:55.694020 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.693999 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zsmk9\"/\"openshift-service-ca.crt\"" Mar 18 16:57:55.694481 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.694466 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zsmk9\"/\"default-dockercfg-vmgwf\"" Mar 18 16:57:55.694537 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.694469 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zsmk9\"/\"kube-root-ca.crt\"" Mar 18 16:57:55.696939 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.696909 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zsmk9/must-gather-b9tr7"] Mar 18 16:57:55.796533 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.796508 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/85ca087d-8d3a-4eed-86af-9a5bd70133e7-must-gather-output\") pod \"must-gather-b9tr7\" (UID: \"85ca087d-8d3a-4eed-86af-9a5bd70133e7\") " pod="openshift-must-gather-zsmk9/must-gather-b9tr7" Mar 18 16:57:55.796676 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.796558 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6ljz\" (UniqueName: \"kubernetes.io/projected/85ca087d-8d3a-4eed-86af-9a5bd70133e7-kube-api-access-j6ljz\") pod \"must-gather-b9tr7\" (UID: \"85ca087d-8d3a-4eed-86af-9a5bd70133e7\") " pod="openshift-must-gather-zsmk9/must-gather-b9tr7" Mar 18 16:57:55.897124 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.897083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/85ca087d-8d3a-4eed-86af-9a5bd70133e7-must-gather-output\") pod \"must-gather-b9tr7\" (UID: \"85ca087d-8d3a-4eed-86af-9a5bd70133e7\") " pod="openshift-must-gather-zsmk9/must-gather-b9tr7" Mar 18 16:57:55.897237 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.897153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6ljz\" (UniqueName: \"kubernetes.io/projected/85ca087d-8d3a-4eed-86af-9a5bd70133e7-kube-api-access-j6ljz\") pod \"must-gather-b9tr7\" (UID: \"85ca087d-8d3a-4eed-86af-9a5bd70133e7\") " pod="openshift-must-gather-zsmk9/must-gather-b9tr7" Mar 18 16:57:55.897399 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.897383 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/85ca087d-8d3a-4eed-86af-9a5bd70133e7-must-gather-output\") pod \"must-gather-b9tr7\" (UID: \"85ca087d-8d3a-4eed-86af-9a5bd70133e7\") " pod="openshift-must-gather-zsmk9/must-gather-b9tr7" Mar 18 16:57:55.905168 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:55.905117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6ljz\" (UniqueName: \"kubernetes.io/projected/85ca087d-8d3a-4eed-86af-9a5bd70133e7-kube-api-access-j6ljz\") pod \"must-gather-b9tr7\" (UID: \"85ca087d-8d3a-4eed-86af-9a5bd70133e7\") " pod="openshift-must-gather-zsmk9/must-gather-b9tr7" Mar 18 16:57:56.001273 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:56.001218 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zsmk9/must-gather-b9tr7" Mar 18 16:57:56.119675 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:56.119634 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zsmk9/must-gather-b9tr7"] Mar 18 16:57:56.122955 ip-10-0-130-255 kubenswrapper[2578]: W0318 16:57:56.122928 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85ca087d_8d3a_4eed_86af_9a5bd70133e7.slice/crio-351e801ee57786cba5dad76fe375c5caf5e6d68ade67ea791b26ecd0172aff1a WatchSource:0}: Error finding container 351e801ee57786cba5dad76fe375c5caf5e6d68ade67ea791b26ecd0172aff1a: Status 404 returned error can't find the container with id 351e801ee57786cba5dad76fe375c5caf5e6d68ade67ea791b26ecd0172aff1a Mar 18 16:57:56.324139 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:56.324040 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zsmk9/must-gather-b9tr7" event={"ID":"85ca087d-8d3a-4eed-86af-9a5bd70133e7","Type":"ContainerStarted","Data":"351e801ee57786cba5dad76fe375c5caf5e6d68ade67ea791b26ecd0172aff1a"} Mar 18 16:57:57.330901 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:57.330822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zsmk9/must-gather-b9tr7" event={"ID":"85ca087d-8d3a-4eed-86af-9a5bd70133e7","Type":"ContainerStarted","Data":"a261ab3568a45d1f4d48c9b322c48bebd4fbd62f535e60137d9aa677af5d0f95"} Mar 18 16:57:57.330901 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:57.330864 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zsmk9/must-gather-b9tr7" event={"ID":"85ca087d-8d3a-4eed-86af-9a5bd70133e7","Type":"ContainerStarted","Data":"74c8a140b3df65f18241b1e6fd56964c396e9b21c11e218c16fb86f9cb70491d"} Mar 18 16:57:57.347877 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:57.347818 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-zsmk9/must-gather-b9tr7" podStartSLOduration=1.530432164 podStartE2EDuration="2.347798642s" podCreationTimestamp="2026-03-18 16:57:55 +0000 UTC" firstStartedPulling="2026-03-18 16:57:56.124889711 +0000 UTC m=+798.709663715" lastFinishedPulling="2026-03-18 16:57:56.942256178 +0000 UTC m=+799.527030193" observedRunningTime="2026-03-18 16:57:57.34582892 +0000 UTC m=+799.930602950" watchObservedRunningTime="2026-03-18 16:57:57.347798642 +0000 UTC m=+799.932572669" Mar 18 16:57:58.492639 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:58.492614 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-q8vrp_48b70cf4-1041-40e2-aac2-2997c4e8585d/global-pull-secret-syncer/0.log" Mar 18 16:57:58.608596 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:58.608561 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ttdbk_245b5eb7-bf78-4c89-8b17-d9e75f2f63d8/konnectivity-agent/0.log" Mar 18 16:57:58.629675 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:57:58.629634 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-255.ec2.internal_576a374e63f549f919ff7ffbf00e5e30/haproxy/0.log" Mar 18 16:58:00.534969 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:00.534918 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-h44fs/must-gather-8wgd9"] Mar 18 16:58:00.535456 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:00.535227 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-h44fs/must-gather-8wgd9" podUID="23707746-3844-4eea-a268-dda75637d2c7" containerName="copy" containerID="cri-o://a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02" gracePeriod=2 Mar 18 16:58:00.537189 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:00.537142 2578 status_manager.go:895] "Failed to get status for pod" 
podUID="23707746-3844-4eea-a268-dda75637d2c7" pod="openshift-must-gather-h44fs/must-gather-8wgd9" err="pods \"must-gather-8wgd9\" is forbidden: User \"system:node:ip-10-0-130-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-h44fs\": no relationship found between node 'ip-10-0-130-255.ec2.internal' and this object" Mar 18 16:58:00.539520 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:00.539463 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-h44fs/must-gather-8wgd9"] Mar 18 16:58:00.996126 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:00.993110 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h44fs_must-gather-8wgd9_23707746-3844-4eea-a268-dda75637d2c7/copy/0.log" Mar 18 16:58:00.996126 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:00.993571 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-h44fs/must-gather-8wgd9" Mar 18 16:58:00.998347 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:00.998298 2578 status_manager.go:895] "Failed to get status for pod" podUID="23707746-3844-4eea-a268-dda75637d2c7" pod="openshift-must-gather-h44fs/must-gather-8wgd9" err="pods \"must-gather-8wgd9\" is forbidden: User \"system:node:ip-10-0-130-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-h44fs\": no relationship found between node 'ip-10-0-130-255.ec2.internal' and this object" Mar 18 16:58:01.053173 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.053118 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nl9c\" (UniqueName: \"kubernetes.io/projected/23707746-3844-4eea-a268-dda75637d2c7-kube-api-access-2nl9c\") pod \"23707746-3844-4eea-a268-dda75637d2c7\" (UID: \"23707746-3844-4eea-a268-dda75637d2c7\") " Mar 18 16:58:01.053367 ip-10-0-130-255 kubenswrapper[2578]: I0318 
16:58:01.053221 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23707746-3844-4eea-a268-dda75637d2c7-must-gather-output\") pod \"23707746-3844-4eea-a268-dda75637d2c7\" (UID: \"23707746-3844-4eea-a268-dda75637d2c7\") " Mar 18 16:58:01.065156 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.060505 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23707746-3844-4eea-a268-dda75637d2c7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "23707746-3844-4eea-a268-dda75637d2c7" (UID: "23707746-3844-4eea-a268-dda75637d2c7"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 18 16:58:01.065564 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.065508 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23707746-3844-4eea-a268-dda75637d2c7-kube-api-access-2nl9c" (OuterVolumeSpecName: "kube-api-access-2nl9c") pod "23707746-3844-4eea-a268-dda75637d2c7" (UID: "23707746-3844-4eea-a268-dda75637d2c7"). InnerVolumeSpecName "kube-api-access-2nl9c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:58:01.154199 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.154132 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nl9c\" (UniqueName: \"kubernetes.io/projected/23707746-3844-4eea-a268-dda75637d2c7-kube-api-access-2nl9c\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:58:01.154199 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.154165 2578 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/23707746-3844-4eea-a268-dda75637d2c7-must-gather-output\") on node \"ip-10-0-130-255.ec2.internal\" DevicePath \"\"" Mar 18 16:58:01.351204 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.351162 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-h44fs_must-gather-8wgd9_23707746-3844-4eea-a268-dda75637d2c7/copy/0.log" Mar 18 16:58:01.352008 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.351980 2578 generic.go:358] "Generic (PLEG): container finished" podID="23707746-3844-4eea-a268-dda75637d2c7" containerID="a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02" exitCode=143 Mar 18 16:58:01.352247 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.352216 2578 scope.go:117] "RemoveContainer" containerID="a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02" Mar 18 16:58:01.352634 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.352616 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-h44fs/must-gather-8wgd9" Mar 18 16:58:01.355786 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.355374 2578 status_manager.go:895] "Failed to get status for pod" podUID="23707746-3844-4eea-a268-dda75637d2c7" pod="openshift-must-gather-h44fs/must-gather-8wgd9" err="pods \"must-gather-8wgd9\" is forbidden: User \"system:node:ip-10-0-130-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-h44fs\": no relationship found between node 'ip-10-0-130-255.ec2.internal' and this object" Mar 18 16:58:01.370935 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.370427 2578 scope.go:117] "RemoveContainer" containerID="a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7" Mar 18 16:58:01.373016 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.372972 2578 status_manager.go:895] "Failed to get status for pod" podUID="23707746-3844-4eea-a268-dda75637d2c7" pod="openshift-must-gather-h44fs/must-gather-8wgd9" err="pods \"must-gather-8wgd9\" is forbidden: User \"system:node:ip-10-0-130-255.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-h44fs\": no relationship found between node 'ip-10-0-130-255.ec2.internal' and this object" Mar 18 16:58:01.430428 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.430397 2578 scope.go:117] "RemoveContainer" containerID="a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02" Mar 18 16:58:01.430938 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:58:01.430777 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02\": container with ID starting with a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02 not found: ID does not exist" containerID="a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02" Mar 18 
16:58:01.430938 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.430827 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02"} err="failed to get container status \"a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02\": rpc error: code = NotFound desc = could not find container \"a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02\": container with ID starting with a04cff384bb7da218fd8a08e6803517ff480dbce093f24f0156fc05138796e02 not found: ID does not exist" Mar 18 16:58:01.430938 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.430853 2578 scope.go:117] "RemoveContainer" containerID="a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7" Mar 18 16:58:01.431651 ip-10-0-130-255 kubenswrapper[2578]: E0318 16:58:01.431624 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7\": container with ID starting with a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7 not found: ID does not exist" containerID="a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7" Mar 18 16:58:01.431803 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.431658 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7"} err="failed to get container status \"a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7\": rpc error: code = NotFound desc = could not find container \"a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7\": container with ID starting with a259d2f3abc3cf54272ad2f80d424a8e983d427a364834726d58adf15c43c8e7 not found: ID does not exist" Mar 18 16:58:01.998711 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:01.998675 2578 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23707746-3844-4eea-a268-dda75637d2c7" path="/var/lib/kubelet/pods/23707746-3844-4eea-a268-dda75637d2c7/volumes" Mar 18 16:58:02.902862 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:02.902833 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-hmt7v_39d41b2c-a876-4347-9f78-5b750630617b/limitador/0.log" Mar 18 16:58:04.052834 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:04.052805 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-jh7zd_73e43277-038f-4657-95b6-addae5fb597c/cluster-monitoring-operator/1.log" Mar 18 16:58:04.164627 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:04.164331 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-jh7zd_73e43277-038f-4657-95b6-addae5fb597c/cluster-monitoring-operator/0.log" Mar 18 16:58:04.274135 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:04.274104 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-675c6476b5-zq6kd_cf1aad5a-c827-45f3-8bd0-35b26f6885fc/metrics-server/0.log" Mar 18 16:58:04.304254 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:04.304186 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-6d47bdb78d-km458_55549973-6078-4e9b-a42a-75ae3ea9c602/monitoring-plugin/0.log" Mar 18 16:58:04.417413 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:04.417388 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jzgqd_d8b3bdea-0c2e-4fc9-a142-180b0451b57b/node-exporter/0.log" Mar 18 16:58:04.442935 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:04.442850 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-jzgqd_d8b3bdea-0c2e-4fc9-a142-180b0451b57b/kube-rbac-proxy/0.log" Mar 18 16:58:04.468242 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:04.468214 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jzgqd_d8b3bdea-0c2e-4fc9-a142-180b0451b57b/init-textfile/0.log" Mar 18 16:58:04.957385 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:04.957350 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-656bbb49d4-xw74z_4df74c11-4f50-4513-944d-19379b1f4184/telemeter-client/0.log" Mar 18 16:58:04.979983 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:04.979944 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-656bbb49d4-xw74z_4df74c11-4f50-4513-944d-19379b1f4184/reload/0.log" Mar 18 16:58:05.006451 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:05.006421 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-656bbb49d4-xw74z_4df74c11-4f50-4513-944d-19379b1f4184/kube-rbac-proxy/0.log" Mar 18 16:58:07.265376 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.265347 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8bbb5d4bc-s8bsb_ac9c969c-4d19-4316-8308-37d777523623/console/0.log" Mar 18 16:58:07.346237 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.346203 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"] Mar 18 16:58:07.346562 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.346546 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23707746-3844-4eea-a268-dda75637d2c7" containerName="gather" Mar 18 16:58:07.346614 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.346564 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="23707746-3844-4eea-a268-dda75637d2c7" 
containerName="gather" Mar 18 16:58:07.346614 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.346576 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23707746-3844-4eea-a268-dda75637d2c7" containerName="copy" Mar 18 16:58:07.346614 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.346582 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="23707746-3844-4eea-a268-dda75637d2c7" containerName="copy" Mar 18 16:58:07.346708 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.346639 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="23707746-3844-4eea-a268-dda75637d2c7" containerName="gather" Mar 18 16:58:07.346708 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.346647 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="23707746-3844-4eea-a268-dda75637d2c7" containerName="copy" Mar 18 16:58:07.350418 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.350393 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r" Mar 18 16:58:07.356520 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.356490 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"] Mar 18 16:58:07.412801 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.412764 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-proc\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r" Mar 18 16:58:07.412957 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.412822 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qpmh\" (UniqueName: \"kubernetes.io/projected/e9a25ebd-5093-4297-ab83-13ec2b79984d-kube-api-access-4qpmh\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r" Mar 18 16:58:07.412957 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.412845 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-lib-modules\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r" Mar 18 16:58:07.413066 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.412974 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-sys\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: 
\"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.413066 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.413010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-podres\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.514207 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.514172 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-sys\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.514606 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.514585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-podres\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.514766 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.514314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-sys\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.514942 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.514745 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-podres\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.515049 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.514791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-proc\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.515210 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.515196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qpmh\" (UniqueName: \"kubernetes.io/projected/e9a25ebd-5093-4297-ab83-13ec2b79984d-kube-api-access-4qpmh\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.515548 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.515488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-lib-modules\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.515675 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.515647 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-lib-modules\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.515758 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.514849 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e9a25ebd-5093-4297-ab83-13ec2b79984d-proc\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.523056 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.523033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qpmh\" (UniqueName: \"kubernetes.io/projected/e9a25ebd-5093-4297-ab83-13ec2b79984d-kube-api-access-4qpmh\") pod \"perf-node-gather-daemonset-lcr5r\" (UID: \"e9a25ebd-5093-4297-ab83-13ec2b79984d\") " pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.664265 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.664230 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:07.753507 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.753476 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-67fdcb5769-wrk6n_5ce47c73-a640-42a6-87a2-d2c7e5f304e2/volume-data-source-validator/0.log"
Mar 18 16:58:07.800693 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:07.800612 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"]
Mar 18 16:58:08.386647 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:08.386610 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r" event={"ID":"e9a25ebd-5093-4297-ab83-13ec2b79984d","Type":"ContainerStarted","Data":"41f452f99fb94310c542167ff064323b57446cdaad21b4dfdff5d8e492b9ddd9"}
Mar 18 16:58:08.386647 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:08.386652 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r" event={"ID":"e9a25ebd-5093-4297-ab83-13ec2b79984d","Type":"ContainerStarted","Data":"12650fc8471825b1440901b9f4b708c2ceb8a7632a31fe64a41512e7f6097eb5"}
Mar 18 16:58:08.389025 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:08.388749 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:08.408206 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:08.408162 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r" podStartSLOduration=1.408147063 podStartE2EDuration="1.408147063s" podCreationTimestamp="2026-03-18 16:58:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:58:08.406288385 +0000 UTC m=+810.991062413" watchObservedRunningTime="2026-03-18 16:58:08.408147063 +0000 UTC m=+810.992921088"
Mar 18 16:58:08.584603 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:08.584580 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pb5h9_0690d7d0-de95-4cec-9e24-53b54d9b232d/dns/0.log"
Mar 18 16:58:08.607943 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:08.607918 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pb5h9_0690d7d0-de95-4cec-9e24-53b54d9b232d/kube-rbac-proxy/0.log"
Mar 18 16:58:08.678520 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:08.678498 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-tvxnf_e48c47f8-3be3-4bee-bab5-5a2d007486f8/dns-node-resolver/0.log"
Mar 18 16:58:09.181240 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:09.181216 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-x499v_bbb42fee-4a86-4fbc-b701-d582b093b57a/node-ca/0.log"
Mar 18 16:58:10.490683 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:10.490656 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6f4h4_0972e2c0-041f-46c3-8440-60ac3028c22d/serve-healthcheck-canary/0.log"
Mar 18 16:58:10.973114 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:10.973068 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-bv4qv_143d5104-122a-4bd9-ac1e-35fce758029a/insights-operator/1.log"
Mar 18 16:58:10.973741 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:10.973720 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-bv4qv_143d5104-122a-4bd9-ac1e-35fce758029a/insights-operator/0.log"
Mar 18 16:58:10.994640 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:10.994622 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4bt94_bab677fb-ab67-4c58-9ff4-1e1a862ff304/kube-rbac-proxy/0.log"
Mar 18 16:58:11.018200 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:11.018178 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4bt94_bab677fb-ab67-4c58-9ff4-1e1a862ff304/exporter/0.log"
Mar 18 16:58:11.041348 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:11.041325 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4bt94_bab677fb-ab67-4c58-9ff4-1e1a862ff304/extractor/0.log"
Mar 18 16:58:13.649649 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:13.649616 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-84978b767b-zgqtj_56681957-3dd3-4e13-8afe-1db309b51d33/manager/0.log"
Mar 18 16:58:14.306516 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:14.306492 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-54n22_5c682e8b-b3e9-467e-87c7-bea53b47a5e6/s3-init/0.log"
Mar 18 16:58:14.402820 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:14.402789 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zsmk9/perf-node-gather-daemonset-lcr5r"
Mar 18 16:58:18.945288 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:18.945218 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-4d8qv_bd635b95-ab3b-4d16-951c-a3b3f97a8996/migrator/0.log"
Mar 18 16:58:18.970748 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:18.970725 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-4d8qv_bd635b95-ab3b-4d16-951c-a3b3f97a8996/graceful-termination/0.log"
Mar 18 16:58:20.334432 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.334402 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-64cg4_cc27bc09-bb46-4e2c-878a-fbd2388a8177/kube-multus/0.log"
Mar 18 16:58:20.549024 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.548993 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d78r7_c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf/kube-multus-additional-cni-plugins/0.log"
Mar 18 16:58:20.578055 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.578029 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d78r7_c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf/egress-router-binary-copy/0.log"
Mar 18 16:58:20.603437 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.603386 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d78r7_c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf/cni-plugins/0.log"
Mar 18 16:58:20.631924 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.631905 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d78r7_c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf/bond-cni-plugin/0.log"
Mar 18 16:58:20.658434 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.658413 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d78r7_c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf/routeoverride-cni/0.log"
Mar 18 16:58:20.682694 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.682676 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d78r7_c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf/whereabouts-cni-bincopy/0.log"
Mar 18 16:58:20.707285 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.707268 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d78r7_c43e0b46-24aa-44cd-bf3f-f8e27b1abbdf/whereabouts-cni/0.log"
Mar 18 16:58:20.933075 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.933044 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jcvjp_474e4d0f-dbfc-41a4-ad8f-fcada6a1b880/network-metrics-daemon/0.log"
Mar 18 16:58:20.958253 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:20.958229 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jcvjp_474e4d0f-dbfc-41a4-ad8f-fcada6a1b880/kube-rbac-proxy/0.log"
Mar 18 16:58:21.874991 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:21.874966 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-controller/0.log"
Mar 18 16:58:21.896298 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:21.896261 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/0.log"
Mar 18 16:58:21.899944 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:21.899922 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovn-acl-logging/1.log"
Mar 18 16:58:21.920595 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:21.920573 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/kube-rbac-proxy-node/0.log"
Mar 18 16:58:21.943305 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:21.943282 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/kube-rbac-proxy-ovn-metrics/0.log"
Mar 18 16:58:21.965188 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:21.965162 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/northd/0.log"
Mar 18 16:58:21.988334 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:21.988316 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/nbdb/0.log"
Mar 18 16:58:22.015732 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:22.015707 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/sbdb/0.log"
Mar 18 16:58:22.131299 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:22.131226 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hjd4v_d1ec73fb-e596-4c69-abc5-b3073ed73133/ovnkube-controller/0.log"
Mar 18 16:58:23.869979 ip-10-0-130-255 kubenswrapper[2578]: I0318 16:58:23.869915 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-b5628_4e172408-4d26-4b03-a0eb-bfcb801cdadc/network-check-target-container/0.log"