Apr 22 18:33:46.389002 ip-10-0-133-29 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:33:46.389014 ip-10-0-133-29 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:33:46.389023 ip-10-0-133-29 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:33:46.389322 ip-10-0-133-29 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:33:56.573696 ip-10-0-133-29 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:33:56.573713 ip-10-0-133-29 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3309674f64be420cb3748499caf3bd2f --
Apr 22 18:36:07.317508 ip-10-0-133-29 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:36:07.724245 ip-10-0-133-29 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:07.724245 ip-10-0-133-29 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:36:07.724245 ip-10-0-133-29 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:07.724245 ip-10-0-133-29 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:36:07.724245 ip-10-0-133-29 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:07.725984 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.725891    2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:36:07.732451 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732419    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:07.732567 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732556    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:07.732567 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732568    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732573    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732577    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732582    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732586    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732592    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732597    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732601    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732606    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732610    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732614    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732618    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732622    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732626    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732630    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732635    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732639    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732643    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732647    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732651    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:07.732666 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732659    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732664    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732668    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732673    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732677    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732681    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732685    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732689    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732693    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732700    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732707    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732712    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732716    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732720    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732724    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732728    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732732    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732736    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732740    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732744    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:07.733482 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732748    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732752    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732756    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732761    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732765    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732769    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732772    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732777    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732784    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732788    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732792    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732796    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732801    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732805    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732812    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732819    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732824    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732829    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732834    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:07.733971 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732839    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732843    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732848    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732853    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732857    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732862    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732866    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732870    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732874    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732881    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732885    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732889    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732893    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732897    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732901    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732905    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732910    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732916    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732920    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732924    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:07.734494 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732929    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:07.735074 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732932    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:07.735074 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732936    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:07.735074 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732940    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:07.735074 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.732945    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735088    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735099    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735103    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735108    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735113    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735117    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735122    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735127    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735133    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735138    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735142    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735147    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735151    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735155    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735159    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735163    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735167    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735170    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735174    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:07.735280 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735178    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735182    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735186    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735190    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735194    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735200    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735204    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735208    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735215    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735221    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735244    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735249    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735253    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735257    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735262    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735267    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735272    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735276    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735280    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:07.736170 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735284    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735288    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735292    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735297    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735301    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735305    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735309    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735313    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735317    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735321    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735325    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735330    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735334    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735338    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735342    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735346    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735350    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735354    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735358    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735364    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:07.737014 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735368    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735372    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735376    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735381    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735385    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735389    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735393    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735399    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735403    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735407    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735411    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735415    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735419    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735423    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735427    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735432    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735436    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735440    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735444    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735448    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:07.737548 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735452    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735456    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735462    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735466    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735472    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735478    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735483    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.735488    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735591    2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735602    2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735613    2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735619    2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735626    2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735632    2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735640    2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735647    2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735651    2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735656    2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735662    2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735668    2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735673    2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735678    2577 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:36:07.738130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735683    2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735688    2577 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735693    2577 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735698    2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735702    2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735708    2577 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735713    2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735718    2577 flags.go:64] FLAG: --config-dir=""
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735723    2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735728    2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735735    2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735740    2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735745    2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735750    2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735754    2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735759    2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735764    2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735769    2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735774    2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735780    2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735785    2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735790    2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735794    2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735800    2577 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735805    2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:36:07.738858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735812    2577 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735817    2577 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735821    2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735826    2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735832    2577 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735838    2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735844    2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735849    2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735855    2577 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735860    2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:36:07.739574
ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735866 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735870 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735875 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735880 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735884 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735889 2577 flags.go:64] FLAG: --feature-gates="" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735895 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735900 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735906 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735911 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735917 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735922 2577 flags.go:64] FLAG: --help="false" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735926 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-133-29.ec2.internal" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735931 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:36:07.739574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735936 2577 flags.go:64] FLAG: 
--http-check-frequency="20s" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735941 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735947 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735952 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735957 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735962 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735967 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735972 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735977 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735983 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735988 2577 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735992 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.735997 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736002 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 
18:36:07.736007 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736012 2577 flags.go:64] FLAG: --lock-file="" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736016 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736021 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736026 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736041 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736046 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736051 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736056 2577 flags.go:64] FLAG: --logging-format="text" Apr 22 18:36:07.740146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736061 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736066 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736071 2577 flags.go:64] FLAG: --manifest-url="" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736075 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736082 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736087 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:36:07.740808 
ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736093 2577 flags.go:64] FLAG: --max-pods="110" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736099 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736104 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736109 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736113 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736118 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736123 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736128 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736140 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736145 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736150 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736156 2577 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736161 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736170 2577 flags.go:64] FLAG: 
--pod-manifest-path="" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736174 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736179 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736184 2577 flags.go:64] FLAG: --port="10250" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736188 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:36:07.740808 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736193 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05705a0d93123116a" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736199 2577 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736204 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736209 2577 flags.go:64] FLAG: --register-node="true" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736214 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736221 2577 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736244 2577 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736249 2577 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736254 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736258 2577 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736264 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:36:07.741407 
ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736269 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736274 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736279 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736284 2577 flags.go:64] FLAG: --runonce="false" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736289 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736294 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736299 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736303 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736308 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736313 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736318 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736323 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736327 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736332 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736337 2577 
flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:36:07.741407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736343 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736348 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736353 2577 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736357 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736366 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736371 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736375 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736382 2577 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736386 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736391 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736396 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736403 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736408 2577 flags.go:64] FLAG: --v="2" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736415 2577 flags.go:64] FLAG: --version="false" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 
18:36:07.736421 2577 flags.go:64] FLAG: --vmodule="" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736428 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.736433 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736583 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736589 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736593 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736598 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736602 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736606 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:07.742088 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736610 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736614 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736618 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736622 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 
18:36:07.736626 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736630 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736634 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736638 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736642 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736646 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736651 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736656 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736660 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736667 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736671 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736675 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736679 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:36:07.742680 ip-10-0-133-29 
kubenswrapper[2577]: W0422 18:36:07.736684 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736688 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736692 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:36:07.742680 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736702 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736707 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736711 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736715 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736720 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736724 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736728 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736732 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736737 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736741 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController 
Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736745 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736749 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736753 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736757 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736761 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736766 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736770 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736774 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736778 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:07.743276 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736783 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736787 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736791 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736795 2577 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736800 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736804 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736810 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736813 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736818 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736822 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736826 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736830 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736834 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736842 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736848 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736853 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736859 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736865 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736870 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:36:07.743807 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736875 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736880 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736884 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736888 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736892 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736897 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736901 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 
18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736906 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736910 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736913 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736917 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736922 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736926 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736930 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736934 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736938 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736942 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736947 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736952 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736958 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:07.744297 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736962 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:07.744798 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.736966 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:07.744798 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.737693 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:07.744859 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.744712 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:36:07.744859 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.744812 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744860 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744866 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744869 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744873 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744875 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744878 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744882 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744884 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744887 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744890 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744893 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744895 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744898 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744901 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744903 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744906 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744909 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744911 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744914 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:07.744911 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744917 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744920 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744923 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744926 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744928 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744931 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744934 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744937 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744940 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744942 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744945 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744947 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744949 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744952 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744954 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744957 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744959 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744961 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744964 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:07.745505 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744968 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744972 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744975 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744977 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744980 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744983 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744986 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744989 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744991 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744994 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744996 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.744999 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745001 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745003 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745006 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745010 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745012 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745015 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745017 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745020 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:07.745962 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745023 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745025 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745028 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745031 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745033 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745036 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745038 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745041 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745043 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745046 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745048 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745051 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745053 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745056 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745059 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745061 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745063 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745066 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745070 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:07.746469 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745074 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745077 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745079 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745082 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745084 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745087 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745090 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745092 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745095 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.745100 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745192 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745197 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745200 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745203 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745206 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745209 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:36:07.746939 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745212 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745214 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745217 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745220 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745222 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745225 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745242 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745245 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745247 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745251 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745254 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745256 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745259 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745262 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745265 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745267 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745270 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745272 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745275 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:36:07.747356 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745277 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745280 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745282 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745285 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745287 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745290 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745292 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745295 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745298 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745301 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745303 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745305 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745308 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745311 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745314 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745316 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745319 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745321 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745323 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745326 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:07.747802 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745328 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745331 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745333 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745336 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745338 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745341 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745343 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745346 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745348 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745350 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745353 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745355 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745358 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745360 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745363 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745365 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745368 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745371 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745373 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745376 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:36:07.748293 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745379 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745381 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745383 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745386 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745388 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745393 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745396 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745399 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745402 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745405 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745408 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745410 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745413 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745416 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745420 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745422 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745425 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745427 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745430 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:36:07.748765 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745432 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:36:07.749215 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:07.745435 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:36:07.749215 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.745439 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:36:07.749215 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.745548 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:36:07.749215 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.748040 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:36:07.749215 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.748911 2577 server.go:1019] "Starting client certificate rotation"
Apr 22 18:36:07.749215 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.749003 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:36:07.749215 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.749035 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:36:07.772319 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.772303 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:36:07.776672 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.776656 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:36:07.792143 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.792127 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:36:07.797453 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.797439 2577 log.go:25] "Validated CRI v1 image API"
Apr 22 18:36:07.799215 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.799200 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:36:07.804244 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.804198 2577 fs.go:135] Filesystem UUIDs: map[107b5d49-1faa-4f45-b2ea-8066ef79f0e1:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 ec7920eb-e2b4-4952-86c6-11641a96d279:/dev/nvme0n1p3]
Apr 22 18:36:07.804317 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.804243 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:36:07.808880 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.808859 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:36:07.810890 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.810788 2577 manager.go:217] Machine: {Timestamp:2026-04-22 18:36:07.808886831 +0000 UTC m=+0.377388634 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099504 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28f5eabb46e19f32aea00b8b15efd5 SystemUUID:ec28f5ea-bb46-e19f-32ae-a00b8b15efd5 BootID:3309674f-64be-420c-b374-8499caf3bd2f Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:16:2d:7e:40:bb Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:16:2d:7e:40:bb Speed:0 Mtu:9001} {Name:ovs-system MacAddress:aa:c1:dd:0b:9b:de Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:36:07.810890 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.810885 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:36:07.810995 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.810957 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:36:07.811930 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.811910 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:36:07.812050 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.811932 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-29.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:36:07.812091 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.812059 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:36:07.812091 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.812068 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:36:07.812091 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.812081 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:07.813525 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.813514 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:07.814213 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.814203 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:07.814347 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.814339 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:36:07.816462 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.816452 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:36:07.816500 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.816467 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:36:07.816500 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.816483 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:36:07.816500 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.816491 2577 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:36:07.816500 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.816500 2577 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 18:36:07.817526 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.817512 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:07.817576 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.817537 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:07.820906 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.820890 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:36:07.822069 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.822056 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:36:07.824085 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824071 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:36:07.824085 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824088 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824094 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824099 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824105 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824110 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824116 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824122 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824130 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824136 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824146 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:36:07.824193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.824158 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:36:07.825039 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.825029 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:36:07.825039 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.825038 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:36:07.828560 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.828541 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-29.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:36:07.828626 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.828570 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:36:07.828626 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.828576 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes 
\"ip-10-0-133-29.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:36:07.828968 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.828955 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:36:07.828999 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.828996 2577 server.go:1295] "Started kubelet" Apr 22 18:36:07.829125 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.829093 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:36:07.829176 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.829138 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:36:07.829176 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.829128 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:36:07.829719 ip-10-0-133-29 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:36:07.830275 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.830157 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:36:07.831219 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.831207 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:36:07.835390 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.835367 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:36:07.835390 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.835375 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:36:07.837180 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.837152 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:07.837307 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.837189 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:36:07.837425 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.837402 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:36:07.837518 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.837432 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:36:07.837622 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.837608 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:36:07.837740 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.837727 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:36:07.838735 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.838716 2577 factory.go:153] Registering CRI-O factory Apr 22 18:36:07.838854 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.838842 2577 factory.go:223] Registration of the crio container factory successfully Apr 22 18:36:07.839013 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.838993 
2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:36:07.839013 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.839013 2577 factory.go:55] Registering systemd factory Apr 22 18:36:07.839153 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.839031 2577 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:36:07.839153 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.839055 2577 factory.go:103] Registering Raw factory Apr 22 18:36:07.839153 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.839070 2577 manager.go:1196] Started watching for new ooms in manager Apr 22 18:36:07.839474 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.839448 2577 manager.go:319] Starting recovery of all containers Apr 22 18:36:07.839993 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.839970 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:36:07.840370 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.838135 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-29.ec2.internal.18a8c1a518e3322e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-29.ec2.internal,UID:ip-10-0-133-29.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-29.ec2.internal,},FirstTimestamp:2026-04-22 18:36:07.828967982 +0000 UTC m=+0.397469779,LastTimestamp:2026-04-22 18:36:07.828967982 +0000 UTC m=+0.397469779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-29.ec2.internal,}" Apr 22 18:36:07.846800 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.846774 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:36:07.846885 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.846803 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-29.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:36:07.850453 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.850437 2577 manager.go:324] 
Recovery completed Apr 22 18:36:07.852147 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.852126 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 18:36:07.854873 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.854861 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:07.857185 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.857162 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:07.857280 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.857196 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:07.857280 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.857206 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:07.857663 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.857650 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:36:07.857663 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.857662 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:36:07.857747 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.857676 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:07.859031 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.858970 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-29.ec2.internal.18a8c1a51a91beae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-29.ec2.internal,UID:ip-10-0-133-29.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-29.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-29.ec2.internal,},FirstTimestamp:2026-04-22 18:36:07.85718443 +0000 UTC m=+0.425686227,LastTimestamp:2026-04-22 18:36:07.85718443 +0000 UTC m=+0.425686227,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-29.ec2.internal,}" Apr 22 18:36:07.859624 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.859609 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-d4wck" Apr 22 18:36:07.859697 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.859640 2577 policy_none.go:49] "None policy: Start" Apr 22 18:36:07.859697 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.859656 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:36:07.859755 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.859702 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:36:07.869002 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.868986 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-d4wck" Apr 22 18:36:07.871028 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.870970 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-29.ec2.internal.18a8c1a51a91fdf2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-29.ec2.internal,UID:ip-10-0-133-29.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-133-29.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-133-29.ec2.internal,},FirstTimestamp:2026-04-22 18:36:07.857200626 +0000 UTC m=+0.425702424,LastTimestamp:2026-04-22 18:36:07.857200626 +0000 UTC m=+0.425702424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-29.ec2.internal,}" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.899405 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.900477 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.900499 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.900536 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.900543 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.900572 2577 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.902830 2577 manager.go:341] "Starting Device Plugin manager" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.902863 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.902874 2577 server.go:85] "Starting device plugin registration server" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.902921 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.903060 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.903074 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.903133 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.903204 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:07.903211 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.903765 2577 
eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:36:07.905008 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:07.903799 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.000695 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.000610 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal"] Apr 22 18:36:08.000695 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.000695 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:08.002607 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.002588 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:08.002711 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.002617 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:08.002711 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.002636 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:08.003692 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.003678 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:08.004361 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.004348 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:08.004446 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.004405 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:08.004446 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.004421 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:08.004446 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.004447 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.004958 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.004945 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:08.005100 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.005086 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.005159 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.005119 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:08.006382 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.006110 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:08.006382 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.006134 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:08.006382 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.006146 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:08.006867 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.006849 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientMemory" Apr 22 
18:36:08.006949 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.006876 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:08.006949 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.006885 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:08.008444 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.008426 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.008444 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.008448 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:08.009623 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.009600 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:08.009698 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.009632 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:08.009698 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.009643 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:08.013085 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.013072 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.013129 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.013094 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-29.ec2.internal\": node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.029488 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.029469 2577 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.033796 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.033779 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-29.ec2.internal\" not found" node="ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.038111 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.038097 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-29.ec2.internal\" not found" node="ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.039163 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.039149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c268259153a8e137f378914d3a6430a6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal\" (UID: \"c268259153a8e137f378914d3a6430a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.039211 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.039172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c268259153a8e137f378914d3a6430a6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal\" (UID: \"c268259153a8e137f378914d3a6430a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.039211 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.039187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6c73ad6357ff9dfe087208dfa7eeace3-config\") pod \"kube-apiserver-proxy-ip-10-0-133-29.ec2.internal\" (UID: \"6c73ad6357ff9dfe087208dfa7eeace3\") 
" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.129587 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.129563 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.139909 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.139887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c268259153a8e137f378914d3a6430a6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal\" (UID: \"c268259153a8e137f378914d3a6430a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.139997 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.139917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c268259153a8e137f378914d3a6430a6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal\" (UID: \"c268259153a8e137f378914d3a6430a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.139997 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.139942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6c73ad6357ff9dfe087208dfa7eeace3-config\") pod \"kube-apiserver-proxy-ip-10-0-133-29.ec2.internal\" (UID: \"6c73ad6357ff9dfe087208dfa7eeace3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.140081 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.139995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c268259153a8e137f378914d3a6430a6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal\" (UID: \"c268259153a8e137f378914d3a6430a6\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.140081 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.140009 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c268259153a8e137f378914d3a6430a6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal\" (UID: \"c268259153a8e137f378914d3a6430a6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.140081 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.140059 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6c73ad6357ff9dfe087208dfa7eeace3-config\") pod \"kube-apiserver-proxy-ip-10-0-133-29.ec2.internal\" (UID: \"6c73ad6357ff9dfe087208dfa7eeace3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.229651 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.229628 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.330474 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.330424 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.335620 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.335604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.340773 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.340754 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal" Apr 22 18:36:08.430568 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.430535 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.531120 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.531093 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.631733 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.631656 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.732320 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.732290 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.748844 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.748821 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:36:08.749049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.748958 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:36:08.832382 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.832350 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:08.836269 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.836250 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:36:08.841700 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.841683 2577 reflector.go:430] 
"Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:08.844961 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.844944 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:36:08.868194 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.868176 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-l9w85" Apr 22 18:36:08.869346 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:08.869324 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c73ad6357ff9dfe087208dfa7eeace3.slice/crio-8900e449f02520a7325c6af156b23d428979383115df13fdebf9121bd9020bb8 WatchSource:0}: Error finding container 8900e449f02520a7325c6af156b23d428979383115df13fdebf9121bd9020bb8: Status 404 returned error can't find the container with id 8900e449f02520a7325c6af156b23d428979383115df13fdebf9121bd9020bb8 Apr 22 18:36:08.869678 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:08.869661 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc268259153a8e137f378914d3a6430a6.slice/crio-aad0dac441fd7c31ecaf910b05d039d13dfc30b54c7638f2476324388b45998a WatchSource:0}: Error finding container aad0dac441fd7c31ecaf910b05d039d13dfc30b54c7638f2476324388b45998a: Status 404 returned error can't find the container with id aad0dac441fd7c31ecaf910b05d039d13dfc30b54c7638f2476324388b45998a Apr 22 18:36:08.870797 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.870750 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:31:07 +0000 UTC" deadline="2027-11-19 12:53:09.148115198 +0000 UTC" Apr 22 
18:36:08.870863 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.870798 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13818h17m0.277321385s" Apr 22 18:36:08.873812 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.873798 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:36:08.878343 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.878327 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-l9w85" Apr 22 18:36:08.902973 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.902903 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal" event={"ID":"6c73ad6357ff9dfe087208dfa7eeace3","Type":"ContainerStarted","Data":"8900e449f02520a7325c6af156b23d428979383115df13fdebf9121bd9020bb8"} Apr 22 18:36:08.903780 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:08.903762 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" event={"ID":"c268259153a8e137f378914d3a6430a6","Type":"ContainerStarted","Data":"aad0dac441fd7c31ecaf910b05d039d13dfc30b54c7638f2476324388b45998a"} Apr 22 18:36:08.932922 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:08.932904 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:09.033541 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.033513 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:09.134095 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.134068 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:09.234722 
ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.234656 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-29.ec2.internal\" not found" Apr 22 18:36:09.329927 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.329900 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:09.335644 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.335624 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" Apr 22 18:36:09.375783 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.375654 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:36:09.376591 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.376570 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal" Apr 22 18:36:09.396296 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.396275 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:09.396858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.396778 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:36:09.817707 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.817683 2577 apiserver.go:52] "Watching apiserver" Apr 22 18:36:09.822785 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.822758 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:36:09.824214 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.824193 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-additional-cni-plugins-dqfsv","openshift-network-diagnostics/network-check-target-p756f","openshift-ovn-kubernetes/ovnkube-node-pr85c","kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal","openshift-image-registry/node-ca-wm9hb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal","openshift-multus/multus-b7lx2","openshift-multus/network-metrics-daemon-vqnz5","openshift-network-operator/iptables-alerter-422g7","kube-system/konnectivity-agent-pjhsl","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7","openshift-cluster-node-tuning-operator/tuned-6vvzm"] Apr 22 18:36:09.827283 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.827261 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" Apr 22 18:36:09.829547 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.829448 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:09.829547 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.829520 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:09.829692 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.829653 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:36:09.829752 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.829654 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:36:09.829805 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.829781 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:36:09.829895 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.829657 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:36:09.830066 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.830051 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dht7c\"" Apr 22 18:36:09.830131 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.830067 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:36:09.832535 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.832040 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.834680 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.834417 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:36:09.834680 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.834430 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:36:09.834680 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.834514 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wcfs2\"" Apr 22 18:36:09.834871 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.834732 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:36:09.834871 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.834752 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:36:09.834871 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.834850 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:36:09.835071 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.835053 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:36:09.836631 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.836612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wm9hb" Apr 22 18:36:09.836719 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.836677 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.838599 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.838402 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:36:09.838706 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.838423 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:36:09.838706 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.838447 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-m6fn9\"" Apr 22 18:36:09.838706 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.838643 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:36:09.838841 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.838454 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:36:09.838916 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.838889 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:09.839017 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.838989 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2" Apr 22 18:36:09.839102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.839067 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6r4sp\"" Apr 22 18:36:09.841427 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.841381 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:09.843421 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.843368 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:36:09.843586 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.843568 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:36:09.843662 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.843616 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:36:09.843785 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.843736 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6fjr7\"" Apr 22 18:36:09.843977 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.843961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:09.846202 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.846184 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:09.847846 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.847826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-kubelet\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.847923 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.847858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.847923 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.847878 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-daemon-config\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.847923 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.847897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89pwp\" (UniqueName: \"kubernetes.io/projected/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-kube-api-access-89pwp\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.848119 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.847920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-systemd-units\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848119 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.847943 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-run-netns\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848119 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.847964 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-run-systemd\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848119 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.847987 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-run-openvswitch\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848119 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-run-ovn\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848119 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848030 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-log-socket\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848119 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/946fd2db-0b92-4961-b670-a53e33d7f40f-ovnkube-script-lib\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848119 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/938b317c-4e75-4cee-9219-836d71fde87b-serviceca\") pod \"node-ca-wm9hb\" (UID: \"938b317c-4e75-4cee-9219-836d71fde87b\") " pod="openshift-image-registry/node-ca-wm9hb" Apr 22 18:36:09.848119 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-slash\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848128 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwncx\" (UniqueName: \"kubernetes.io/projected/938b317c-4e75-4cee-9219-836d71fde87b-kube-api-access-hwncx\") pod \"node-ca-wm9hb\" (UID: \"938b317c-4e75-4cee-9219-836d71fde87b\") " pod="openshift-image-registry/node-ca-wm9hb" Apr 
22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848160 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-cnibin\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-socket-dir-parent\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-var-lib-cni-bin\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-var-lib-cni-multus\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-etc-kubernetes\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " 
pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-system-cni-dir\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-var-lib-openvswitch\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-run-ovn-kubernetes\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-cni-bin\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-cni-netd\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/946fd2db-0b92-4961-b670-a53e33d7f40f-ovn-node-metrics-cert\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-cni-binary-copy\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.848532 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848515 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-conf-dir\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/946fd2db-0b92-4961-b670-a53e33d7f40f-ovnkube-config\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848591 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/938b317c-4e75-4cee-9219-836d71fde87b-host\") pod \"node-ca-wm9hb\" (UID: \"938b317c-4e75-4cee-9219-836d71fde87b\") " pod="openshift-image-registry/node-ca-wm9hb" Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848615 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-os-release\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw4b5\" (UniqueName: \"kubernetes.io/projected/f20c356d-ebd8-4177-92c7-8bf2571249a2-kube-api-access-kw4b5\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/447d848d-6ef3-4b39-a91c-16579bc83c6d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv" Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848681 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-run-netns\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" 
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848692 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6vvzm"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-run-multus-certs\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848756 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2gjr\" (UniqueName: \"kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr\") pod \"network-check-target-p756f\" (UID: \"f632af1b-67e1-4b4d-9446-ea503297edd6\") " pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-etc-openvswitch\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-node-log\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp46s\" (UniqueName: \"kubernetes.io/projected/946fd2db-0b92-4961-b670-a53e33d7f40f-kube-api-access-rp46s\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848845 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-system-cni-dir\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-run-k8s-cni-cncf-io\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-hostroot\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.849101 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:09.849702 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.848970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/946fd2db-0b92-4961-b670-a53e33d7f40f-env-overrides\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.849702 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.849013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-cni-dir\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.849702 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.849046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-var-lib-kubelet\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.849702 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.849069 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-cnibin\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.849702 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.849094 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-os-release\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.849702 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.849118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/447d848d-6ef3-4b39-a91c-16579bc83c6d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.849702 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.849167 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/447d848d-6ef3-4b39-a91c-16579bc83c6d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.849702 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.849201 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn2v2\" (UniqueName: \"kubernetes.io/projected/447d848d-6ef3-4b39-a91c-16579bc83c6d-kube-api-access-hn2v2\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.857027 ip-10-0-133-29 
kubenswrapper[2577]: I0422 18:36:09.857008 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:36:09.857248 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.857211 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fprjx\""
Apr 22 18:36:09.857315 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.857223 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:36:09.857795 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.857777 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:36:09.858030 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.858011 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:09.858167 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.858150 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:36:09.858334 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.858320 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mgxxx\""
Apr 22 18:36:09.858751 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.858730 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:36:09.858829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.858734 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:09.859203 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.859182 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lqdrk\""
Apr 22 18:36:09.878884 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.878849 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:08 +0000 UTC" deadline="2027-10-17 09:27:30.565493227 +0000 UTC"
Apr 22 18:36:09.878884 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.878872 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13022h51m20.686624813s"
Apr 22 18:36:09.937357 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.937332 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:36:09.937950 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.937927 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 18:36:09.949691 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949672 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-var-lib-openvswitch\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.949805 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-cni-bin\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.949805 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-cni-netd\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.949805 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/946fd2db-0b92-4961-b670-a53e33d7f40f-ovn-node-metrics-cert\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.949805 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949742 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-conf-dir\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.949805 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949789 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-cni-bin\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-cni-netd\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949789 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-var-lib-openvswitch\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949814 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-systemd\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949849 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-conf-dir\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/938b317c-4e75-4cee-9219-836d71fde87b-host\") pod \"node-ca-wm9hb\" (UID: \"938b317c-4e75-4cee-9219-836d71fde87b\") " pod="openshift-image-registry/node-ca-wm9hb"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/938b317c-4e75-4cee-9219-836d71fde87b-host\") pod \"node-ca-wm9hb\" (UID: \"938b317c-4e75-4cee-9219-836d71fde87b\") " pod="openshift-image-registry/node-ca-wm9hb"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949927 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kw4b5\" (UniqueName: \"kubernetes.io/projected/f20c356d-ebd8-4177-92c7-8bf2571249a2-kube-api-access-kw4b5\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-device-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.949996 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8jsj\" (UniqueName: \"kubernetes.io/projected/85884605-a0fa-4cd6-b0b8-07548e94e655-kube-api-access-q8jsj\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7"
Apr 22 18:36:09.950049 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-tuned\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6vvzm"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-cnibin\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950112 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-etc-openvswitch\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-cnibin\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-node-log\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-run-k8s-cni-cncf-io\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-run-k8s-cni-cncf-io\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-node-log\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/447d848d-6ef3-4b39-a91c-16579bc83c6d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-etc-openvswitch\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-daemon-config\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950373 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-kubernetes\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm"
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.950389 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:09.950538 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-sysctl-conf\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63465ecd-302d-4a7f-b3d3-9b9cc341d995-tmp\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.950482 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs podName:f20c356d-ebd8-4177-92c7-8bf2571249a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:10.450439477 +0000 UTC m=+3.018941265 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs") pod "network-metrics-daemon-vqnz5" (UID: "f20c356d-ebd8-4177-92c7-8bf2571249a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf8vb\" (UniqueName: \"kubernetes.io/projected/63465ecd-302d-4a7f-b3d3-9b9cc341d995-kube-api-access-vf8vb\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950522 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-sys-fs\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-systemd-units\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-run-systemd\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.951102 ip-10-0-133-29 
kubenswrapper[2577]: I0422 18:36:09.950597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-cnibin\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-var-lib-cni-bin\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950667 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-var-lib-cni-bin\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/447d848d-6ef3-4b39-a91c-16579bc83c6d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-lib-modules\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-slash\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwncx\" (UniqueName: \"kubernetes.io/projected/938b317c-4e75-4cee-9219-836d71fde87b-kube-api-access-hwncx\") pod \"node-ca-wm9hb\" (UID: \"938b317c-4e75-4cee-9219-836d71fde87b\") " pod="openshift-image-registry/node-ca-wm9hb"
Apr 22 18:36:09.951102 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950788 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/447d848d-6ef3-4b39-a91c-16579bc83c6d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-system-cni-dir\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950834 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-run-ovn-kubernetes\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-cni-binary-copy\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/946fd2db-0b92-4961-b670-a53e33d7f40f-ovnkube-config\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-systemd-units\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-os-release\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950925 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-run-systemd\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d6ae7655-3652-4c40-a767-81b41b44d742-agent-certs\") pod \"konnectivity-agent-pjhsl\" (UID: \"d6ae7655-3652-4c40-a767-81b41b44d742\") " pod="kube-system/konnectivity-agent-pjhsl"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950950 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-system-cni-dir\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d6ae7655-3652-4c40-a767-81b41b44d742-konnectivity-ca\") pod \"konnectivity-agent-pjhsl\" (UID: \"d6ae7655-3652-4c40-a767-81b41b44d742\") " pod="kube-system/konnectivity-agent-pjhsl"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-slash\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.950999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-run-netns\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-run-multus-certs\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f81323a-e699-4fb2-8971-b7fb4c18c2b3-iptables-alerter-script\") pod \"iptables-alerter-422g7\" (UID: \"4f81323a-e699-4fb2-8971-b7fb4c18c2b3\") " pod="openshift-network-operator/iptables-alerter-422g7"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn2v2\" (UniqueName: \"kubernetes.io/projected/447d848d-6ef3-4b39-a91c-16579bc83c6d-kube-api-access-hn2v2\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951085 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/447d848d-6ef3-4b39-a91c-16579bc83c6d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv"
Apr 22 18:36:09.951763 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-run-ovn-kubernetes\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c"
Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-run-netns\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-run-multus-certs\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2"
Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 
18:36:09.950861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-cnibin\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gjr\" (UniqueName: \"kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr\") pod \"network-check-target-p756f\" (UID: \"f632af1b-67e1-4b4d-9446-ea503297edd6\") " pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951350 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-os-release\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp46s\" (UniqueName: \"kubernetes.io/projected/946fd2db-0b92-4961-b670-a53e33d7f40f-kube-api-access-rp46s\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-system-cni-dir\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951475 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-system-cni-dir\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-hostroot\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-sysctl-d\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-daemon-config\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-hostroot\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-cni-binary-copy\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951667 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-run\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-var-lib-kubelet\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/946fd2db-0b92-4961-b670-a53e33d7f40f-env-overrides\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-cni-dir\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.952459 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951796 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-var-lib-kubelet\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951814 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/946fd2db-0b92-4961-b670-a53e33d7f40f-ovnkube-config\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951822 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-os-release\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951863 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/447d848d-6ef3-4b39-a91c-16579bc83c6d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/447d848d-6ef3-4b39-a91c-16579bc83c6d-os-release\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951866 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-cni-dir\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-var-lib-kubelet\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951895 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-sysconfig\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-host\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951955 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-registration-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.951981 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-kubelet\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89pwp\" (UniqueName: \"kubernetes.io/projected/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-kube-api-access-89pwp\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952030 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-kubelet\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952046 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-socket-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/946fd2db-0b92-4961-b670-a53e33d7f40f-env-overrides\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 
18:36:09.952069 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-etc-selinux\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f81323a-e699-4fb2-8971-b7fb4c18c2b3-host-slash\") pod \"iptables-alerter-422g7\" (UID: \"4f81323a-e699-4fb2-8971-b7fb4c18c2b3\") " pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:09.953366 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbrtb\" (UniqueName: \"kubernetes.io/projected/4f81323a-e699-4fb2-8971-b7fb4c18c2b3-kube-api-access-kbrtb\") pod \"iptables-alerter-422g7\" (UID: \"4f81323a-e699-4fb2-8971-b7fb4c18c2b3\") " pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952183 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-run-netns\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-run-openvswitch\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952247 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-host-run-netns\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-run-ovn\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-run-openvswitch\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952284 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-log-socket\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-log-socket\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/447d848d-6ef3-4b39-a91c-16579bc83c6d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/946fd2db-0b92-4961-b670-a53e33d7f40f-ovnkube-script-lib\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952372 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/938b317c-4e75-4cee-9219-836d71fde87b-serviceca\") pod \"node-ca-wm9hb\" (UID: \"938b317c-4e75-4cee-9219-836d71fde87b\") " pod="openshift-image-registry/node-ca-wm9hb" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/946fd2db-0b92-4961-b670-a53e33d7f40f-run-ovn\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-socket-dir-parent\") pod \"multus-b7lx2\" (UID: 
\"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-var-lib-cni-multus\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-etc-kubernetes\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-multus-socket-dir-parent\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952485 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-host-var-lib-cni-multus\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-modprobe-d\") pod \"tuned-6vvzm\" (UID: 
\"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:09.954124 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-sys\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:09.954953 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952532 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-etc-kubernetes\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.954953 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.952815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/946fd2db-0b92-4961-b670-a53e33d7f40f-ovnkube-script-lib\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.954953 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.953139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/938b317c-4e75-4cee-9219-836d71fde87b-serviceca\") pod \"node-ca-wm9hb\" (UID: \"938b317c-4e75-4cee-9219-836d71fde87b\") " pod="openshift-image-registry/node-ca-wm9hb" Apr 22 18:36:09.954953 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.953611 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/946fd2db-0b92-4961-b670-a53e33d7f40f-ovn-node-metrics-cert\") pod \"ovnkube-node-pr85c\" (UID: 
\"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.965255 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.965196 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:09.965255 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.965216 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:09.965255 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.965242 2577 projected.go:194] Error preparing data for projected volume kube-api-access-b2gjr for pod openshift-network-diagnostics/network-check-target-p756f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:09.965452 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:09.965296 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr podName:f632af1b-67e1-4b4d-9446-ea503297edd6 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:10.465283893 +0000 UTC m=+3.033785682 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-b2gjr" (UniqueName: "kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr") pod "network-check-target-p756f" (UID: "f632af1b-67e1-4b4d-9446-ea503297edd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:09.966817 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.966796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89pwp\" (UniqueName: \"kubernetes.io/projected/ff2fec6e-71e1-40b5-a159-2609e7db8ff5-kube-api-access-89pwp\") pod \"multus-b7lx2\" (UID: \"ff2fec6e-71e1-40b5-a159-2609e7db8ff5\") " pod="openshift-multus/multus-b7lx2" Apr 22 18:36:09.967287 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.967264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwncx\" (UniqueName: \"kubernetes.io/projected/938b317c-4e75-4cee-9219-836d71fde87b-kube-api-access-hwncx\") pod \"node-ca-wm9hb\" (UID: \"938b317c-4e75-4cee-9219-836d71fde87b\") " pod="openshift-image-registry/node-ca-wm9hb" Apr 22 18:36:09.967785 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.967748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp46s\" (UniqueName: \"kubernetes.io/projected/946fd2db-0b92-4961-b670-a53e33d7f40f-kube-api-access-rp46s\") pod \"ovnkube-node-pr85c\" (UID: \"946fd2db-0b92-4961-b670-a53e33d7f40f\") " pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:09.968022 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.967995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw4b5\" (UniqueName: \"kubernetes.io/projected/f20c356d-ebd8-4177-92c7-8bf2571249a2-kube-api-access-kw4b5\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " 
pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:09.968976 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:09.968956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn2v2\" (UniqueName: \"kubernetes.io/projected/447d848d-6ef3-4b39-a91c-16579bc83c6d-kube-api-access-hn2v2\") pod \"multus-additional-cni-plugins-dqfsv\" (UID: \"447d848d-6ef3-4b39-a91c-16579bc83c6d\") " pod="openshift-multus/multus-additional-cni-plugins-dqfsv" Apr 22 18:36:10.052960 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.052926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-kubernetes\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.052960 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.052960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-sysctl-conf\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053177 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.052976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63465ecd-302d-4a7f-b3d3-9b9cc341d995-tmp\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053177 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.052994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf8vb\" (UniqueName: \"kubernetes.io/projected/63465ecd-302d-4a7f-b3d3-9b9cc341d995-kube-api-access-vf8vb\") pod \"tuned-6vvzm\" (UID: 
\"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053177 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.052999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-kubernetes\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053177 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-sys-fs\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.053177 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-lib-modules\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053177 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-lib-modules\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053177 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" 
(UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.053177 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-sysctl-conf\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053177 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d6ae7655-3652-4c40-a767-81b41b44d742-agent-certs\") pod \"konnectivity-agent-pjhsl\" (UID: \"d6ae7655-3652-4c40-a767-81b41b44d742\") " pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-sys-fs\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053207 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d6ae7655-3652-4c40-a767-81b41b44d742-konnectivity-ca\") pod \"konnectivity-agent-pjhsl\" (UID: \"d6ae7655-3652-4c40-a767-81b41b44d742\") " pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053254 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/4f81323a-e699-4fb2-8971-b7fb4c18c2b3-iptables-alerter-script\") pod \"iptables-alerter-422g7\" (UID: \"4f81323a-e699-4fb2-8971-b7fb4c18c2b3\") " pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053220 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053305 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-sysctl-d\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053331 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-run\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-var-lib-kubelet\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-sysconfig\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-host\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-registration-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053505 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-socket-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-run\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-etc-selinux\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-host\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-etc-selinux\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f81323a-e699-4fb2-8971-b7fb4c18c2b3-host-slash\") pod \"iptables-alerter-422g7\" (UID: \"4f81323a-e699-4fb2-8971-b7fb4c18c2b3\") " pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:10.053584 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f81323a-e699-4fb2-8971-b7fb4c18c2b3-host-slash\") pod \"iptables-alerter-422g7\" (UID: \"4f81323a-e699-4fb2-8971-b7fb4c18c2b3\") " pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053478 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-sysctl-d\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbrtb\" (UniqueName: \"kubernetes.io/projected/4f81323a-e699-4fb2-8971-b7fb4c18c2b3-kube-api-access-kbrtb\") pod \"iptables-alerter-422g7\" (UID: \"4f81323a-e699-4fb2-8971-b7fb4c18c2b3\") " pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-var-lib-kubelet\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-modprobe-d\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053653 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-registration-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053674 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-sys\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-sysconfig\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-systemd\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-device-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8jsj\" (UniqueName: \"kubernetes.io/projected/85884605-a0fa-4cd6-b0b8-07548e94e655-kube-api-access-q8jsj\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.054358 ip-10-0-133-29 
kubenswrapper[2577]: I0422 18:36:10.053775 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-socket-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053784 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-tuned\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053808 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d6ae7655-3652-4c40-a767-81b41b44d742-konnectivity-ca\") pod \"konnectivity-agent-pjhsl\" (UID: \"d6ae7655-3652-4c40-a767-81b41b44d742\") " pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053816 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f81323a-e699-4fb2-8971-b7fb4c18c2b3-iptables-alerter-script\") pod \"iptables-alerter-422g7\" (UID: \"4f81323a-e699-4fb2-8971-b7fb4c18c2b3\") " pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-systemd\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.054358 
ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-sys\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.054358 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053886 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/85884605-a0fa-4cd6-b0b8-07548e94e655-device-dir\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.055172 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.053904 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-modprobe-d\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.055420 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.055401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d6ae7655-3652-4c40-a767-81b41b44d742-agent-certs\") pod \"konnectivity-agent-pjhsl\" (UID: \"d6ae7655-3652-4c40-a767-81b41b44d742\") " pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:10.055677 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.055656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63465ecd-302d-4a7f-b3d3-9b9cc341d995-tmp\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.055929 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.055906 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/63465ecd-302d-4a7f-b3d3-9b9cc341d995-etc-tuned\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.063449 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.063418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf8vb\" (UniqueName: \"kubernetes.io/projected/63465ecd-302d-4a7f-b3d3-9b9cc341d995-kube-api-access-vf8vb\") pod \"tuned-6vvzm\" (UID: \"63465ecd-302d-4a7f-b3d3-9b9cc341d995\") " pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.063822 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.063799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8jsj\" (UniqueName: \"kubernetes.io/projected/85884605-a0fa-4cd6-b0b8-07548e94e655-kube-api-access-q8jsj\") pod \"aws-ebs-csi-driver-node-k86t7\" (UID: \"85884605-a0fa-4cd6-b0b8-07548e94e655\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.064720 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.064699 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbrtb\" (UniqueName: \"kubernetes.io/projected/4f81323a-e699-4fb2-8971-b7fb4c18c2b3-kube-api-access-kbrtb\") pod \"iptables-alerter-422g7\" (UID: \"4f81323a-e699-4fb2-8971-b7fb4c18c2b3\") " pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:10.139811 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.139699 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" Apr 22 18:36:10.146581 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.146556 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:10.154455 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.154434 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wm9hb" Apr 22 18:36:10.160040 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.160023 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b7lx2" Apr 22 18:36:10.166577 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.166558 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-422g7" Apr 22 18:36:10.172104 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.172087 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:10.178647 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.178617 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" Apr 22 18:36:10.183170 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.183149 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" Apr 22 18:36:10.456291 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.456210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:10.456439 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:10.456369 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:10.456439 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:10.456428 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs podName:f20c356d-ebd8-4177-92c7-8bf2571249a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:11.4564136 +0000 UTC m=+4.024915391 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs") pod "network-metrics-daemon-vqnz5" (UID: "f20c356d-ebd8-4177-92c7-8bf2571249a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:10.556821 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.556794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gjr\" (UniqueName: \"kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr\") pod \"network-check-target-p756f\" (UID: \"f632af1b-67e1-4b4d-9446-ea503297edd6\") " pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:10.556962 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:10.556891 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:10.556962 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:10.556904 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:10.556962 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:10.556912 2577 projected.go:194] Error preparing data for projected volume kube-api-access-b2gjr for pod openshift-network-diagnostics/network-check-target-p756f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:10.556962 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:10.556956 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr podName:f632af1b-67e1-4b4d-9446-ea503297edd6 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:11.556941588 +0000 UTC m=+4.125443373 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2gjr" (UniqueName: "kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr") pod "network-check-target-p756f" (UID: "f632af1b-67e1-4b4d-9446-ea503297edd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:10.608397 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:10.608362 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6ae7655_3652_4c40_a767_81b41b44d742.slice/crio-9fab13d6214645b054bc31393d145e6d24d0355efc737d21aeae1cde31b78b3c WatchSource:0}: Error finding container 9fab13d6214645b054bc31393d145e6d24d0355efc737d21aeae1cde31b78b3c: Status 404 returned error can't find the container with id 9fab13d6214645b054bc31393d145e6d24d0355efc737d21aeae1cde31b78b3c Apr 22 18:36:10.611990 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:10.611966 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff2fec6e_71e1_40b5_a159_2609e7db8ff5.slice/crio-dd1db67fa596ab3d56e187014a0e13411c700348478b4828680592fe2d7c4a78 WatchSource:0}: Error finding container dd1db67fa596ab3d56e187014a0e13411c700348478b4828680592fe2d7c4a78: Status 404 returned error can't find the container with id dd1db67fa596ab3d56e187014a0e13411c700348478b4828680592fe2d7c4a78 Apr 22 18:36:10.612994 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:10.612972 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod946fd2db_0b92_4961_b670_a53e33d7f40f.slice/crio-c3a608632fde251a11f0eaf9c081de0ce636593e8f2a10028ad82ab185796e52 WatchSource:0}: Error finding container 
c3a608632fde251a11f0eaf9c081de0ce636593e8f2a10028ad82ab185796e52: Status 404 returned error can't find the container with id c3a608632fde251a11f0eaf9c081de0ce636593e8f2a10028ad82ab185796e52 Apr 22 18:36:10.614282 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:10.614264 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63465ecd_302d_4a7f_b3d3_9b9cc341d995.slice/crio-f820f0fc8bae21f0cb66d37759e6d05ebab3dbd887f59b7971ddaf05e5593e88 WatchSource:0}: Error finding container f820f0fc8bae21f0cb66d37759e6d05ebab3dbd887f59b7971ddaf05e5593e88: Status 404 returned error can't find the container with id f820f0fc8bae21f0cb66d37759e6d05ebab3dbd887f59b7971ddaf05e5593e88 Apr 22 18:36:10.615214 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:10.615183 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f81323a_e699_4fb2_8971_b7fb4c18c2b3.slice/crio-538d34a8f7a6078011efe9bbcead3b187bfca0693cb0bec11f82482cc708e0ae WatchSource:0}: Error finding container 538d34a8f7a6078011efe9bbcead3b187bfca0693cb0bec11f82482cc708e0ae: Status 404 returned error can't find the container with id 538d34a8f7a6078011efe9bbcead3b187bfca0693cb0bec11f82482cc708e0ae Apr 22 18:36:10.616420 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:10.616390 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod447d848d_6ef3_4b39_a91c_16579bc83c6d.slice/crio-6e5c8f58c5b0080cb3353348443ecbaccb41957ba5e5b4364f5bd459a0781f5b WatchSource:0}: Error finding container 6e5c8f58c5b0080cb3353348443ecbaccb41957ba5e5b4364f5bd459a0781f5b: Status 404 returned error can't find the container with id 6e5c8f58c5b0080cb3353348443ecbaccb41957ba5e5b4364f5bd459a0781f5b Apr 22 18:36:10.618038 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:10.617724 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938b317c_4e75_4cee_9219_836d71fde87b.slice/crio-16e8eb3deaebe86f09663b2eaee14061eebc0ae677126396557f7f0173b90921 WatchSource:0}: Error finding container 16e8eb3deaebe86f09663b2eaee14061eebc0ae677126396557f7f0173b90921: Status 404 returned error can't find the container with id 16e8eb3deaebe86f09663b2eaee14061eebc0ae677126396557f7f0173b90921 Apr 22 18:36:10.618455 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:10.618302 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85884605_a0fa_4cd6_b0b8_07548e94e655.slice/crio-044eacc0163f2e84cf0e57d6c9604d9ab3b773b2c8cf7982f097a09235199cad WatchSource:0}: Error finding container 044eacc0163f2e84cf0e57d6c9604d9ab3b773b2c8cf7982f097a09235199cad: Status 404 returned error can't find the container with id 044eacc0163f2e84cf0e57d6c9604d9ab3b773b2c8cf7982f097a09235199cad Apr 22 18:36:10.879569 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.879498 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:08 +0000 UTC" deadline="2027-10-11 13:38:38.998528142 +0000 UTC" Apr 22 18:36:10.879569 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.879525 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12883h2m28.119005252s" Apr 22 18:36:10.901738 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.901716 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:10.901855 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:10.901804 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:10.908890 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.908863 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wm9hb" event={"ID":"938b317c-4e75-4cee-9219-836d71fde87b","Type":"ContainerStarted","Data":"16e8eb3deaebe86f09663b2eaee14061eebc0ae677126396557f7f0173b90921"} Apr 22 18:36:10.910347 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.910319 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" event={"ID":"447d848d-6ef3-4b39-a91c-16579bc83c6d","Type":"ContainerStarted","Data":"6e5c8f58c5b0080cb3353348443ecbaccb41957ba5e5b4364f5bd459a0781f5b"} Apr 22 18:36:10.911224 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.911181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-422g7" event={"ID":"4f81323a-e699-4fb2-8971-b7fb4c18c2b3","Type":"ContainerStarted","Data":"538d34a8f7a6078011efe9bbcead3b187bfca0693cb0bec11f82482cc708e0ae"} Apr 22 18:36:10.912159 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.912112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" event={"ID":"63465ecd-302d-4a7f-b3d3-9b9cc341d995","Type":"ContainerStarted","Data":"f820f0fc8bae21f0cb66d37759e6d05ebab3dbd887f59b7971ddaf05e5593e88"} Apr 22 18:36:10.913081 ip-10-0-133-29 
kubenswrapper[2577]: I0422 18:36:10.913063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" event={"ID":"946fd2db-0b92-4961-b670-a53e33d7f40f","Type":"ContainerStarted","Data":"c3a608632fde251a11f0eaf9c081de0ce636593e8f2a10028ad82ab185796e52"}
Apr 22 18:36:10.914353 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.914333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal" event={"ID":"6c73ad6357ff9dfe087208dfa7eeace3","Type":"ContainerStarted","Data":"6ca5f39e94a4a9764bc022616446fe8a42f04db98c1c783e80f8b3ffa8bb9aff"}
Apr 22 18:36:10.915336 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.915318 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" event={"ID":"85884605-a0fa-4cd6-b0b8-07548e94e655","Type":"ContainerStarted","Data":"044eacc0163f2e84cf0e57d6c9604d9ab3b773b2c8cf7982f097a09235199cad"}
Apr 22 18:36:10.916255 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.916204 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b7lx2" event={"ID":"ff2fec6e-71e1-40b5-a159-2609e7db8ff5","Type":"ContainerStarted","Data":"dd1db67fa596ab3d56e187014a0e13411c700348478b4828680592fe2d7c4a78"}
Apr 22 18:36:10.917118 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.917101 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pjhsl" event={"ID":"d6ae7655-3652-4c40-a767-81b41b44d742","Type":"ContainerStarted","Data":"9fab13d6214645b054bc31393d145e6d24d0355efc737d21aeae1cde31b78b3c"}
Apr 22 18:36:10.931911 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:10.931871 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-29.ec2.internal" podStartSLOduration=1.931860337 podStartE2EDuration="1.931860337s" podCreationTimestamp="2026-04-22 18:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:10.931588742 +0000 UTC m=+3.500090552" watchObservedRunningTime="2026-04-22 18:36:10.931860337 +0000 UTC m=+3.500362215"
Apr 22 18:36:11.463206 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:11.463123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:11.463366 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:11.463325 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:11.463437 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:11.463390 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs podName:f20c356d-ebd8-4177-92c7-8bf2571249a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:13.46337131 +0000 UTC m=+6.031873102 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs") pod "network-metrics-daemon-vqnz5" (UID: "f20c356d-ebd8-4177-92c7-8bf2571249a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:11.564397 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:11.563671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gjr\" (UniqueName: \"kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr\") pod \"network-check-target-p756f\" (UID: \"f632af1b-67e1-4b4d-9446-ea503297edd6\") " pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:11.564397 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:11.563861 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:36:11.564397 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:11.563881 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:36:11.564397 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:11.563894 2577 projected.go:194] Error preparing data for projected volume kube-api-access-b2gjr for pod openshift-network-diagnostics/network-check-target-p756f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:11.564397 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:11.563978 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr podName:f632af1b-67e1-4b4d-9446-ea503297edd6 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:13.563929477 +0000 UTC m=+6.132431279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2gjr" (UniqueName: "kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr") pod "network-check-target-p756f" (UID: "f632af1b-67e1-4b4d-9446-ea503297edd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:11.901460 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:11.901368 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:11.902058 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:11.901497 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2"
Apr 22 18:36:11.927563 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:11.927508 2577 generic.go:358] "Generic (PLEG): container finished" podID="c268259153a8e137f378914d3a6430a6" containerID="1e0b73fe70f07d38b728031dc0ab8c9bc01cc472a1e54d884342e15751cba4c9" exitCode=0
Apr 22 18:36:11.928542 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:11.928517 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" event={"ID":"c268259153a8e137f378914d3a6430a6","Type":"ContainerDied","Data":"1e0b73fe70f07d38b728031dc0ab8c9bc01cc472a1e54d884342e15751cba4c9"}
Apr 22 18:36:12.902063 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:12.901552 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:12.902063 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:12.901672 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6"
Apr 22 18:36:12.937748 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:12.937706 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" event={"ID":"c268259153a8e137f378914d3a6430a6","Type":"ContainerStarted","Data":"7f0d264dac4f1c21137e180ddca6f8c94a7ba93c77fee0634322c2641fba8f24"}
Apr 22 18:36:13.478477 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:13.478439 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:13.478669 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:13.478596 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:13.478669 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:13.478658 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs podName:f20c356d-ebd8-4177-92c7-8bf2571249a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:17.478640356 +0000 UTC m=+10.047142145 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs") pod "network-metrics-daemon-vqnz5" (UID: "f20c356d-ebd8-4177-92c7-8bf2571249a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:13.579215 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:13.579182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gjr\" (UniqueName: \"kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr\") pod \"network-check-target-p756f\" (UID: \"f632af1b-67e1-4b4d-9446-ea503297edd6\") " pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:13.579386 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:13.579368 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:36:13.579386 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:13.579386 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:36:13.579567 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:13.579399 2577 projected.go:194] Error preparing data for projected volume kube-api-access-b2gjr for pod openshift-network-diagnostics/network-check-target-p756f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:13.579567 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:13.579452 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr podName:f632af1b-67e1-4b4d-9446-ea503297edd6 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:17.579434401 +0000 UTC m=+10.147936202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2gjr" (UniqueName: "kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr") pod "network-check-target-p756f" (UID: "f632af1b-67e1-4b4d-9446-ea503297edd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:13.901573 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:13.901498 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:13.901732 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:13.901638 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2"
Apr 22 18:36:14.842262 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.842122 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-29.ec2.internal" podStartSLOduration=5.842106597 podStartE2EDuration="5.842106597s" podCreationTimestamp="2026-04-22 18:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:12.957976782 +0000 UTC m=+5.526478591" watchObservedRunningTime="2026-04-22 18:36:14.842106597 +0000 UTC m=+7.410608403"
Apr 22 18:36:14.842776 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.842718 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fbr4l"]
Apr 22 18:36:14.845714 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.845689 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:14.845852 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:14.845768 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e"
Apr 22 18:36:14.890760 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.890546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b3a1c2eb-91b0-417e-818b-e08b94eca20e-dbus\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:14.890760 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.890618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:14.890760 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.890692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b3a1c2eb-91b0-417e-818b-e08b94eca20e-kubelet-config\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:14.901329 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.901306 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:14.901456 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:14.901417 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6"
Apr 22 18:36:14.991011 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.990970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:14.991167 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.991053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b3a1c2eb-91b0-417e-818b-e08b94eca20e-kubelet-config\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:14.991167 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.991084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b3a1c2eb-91b0-417e-818b-e08b94eca20e-dbus\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:14.991167 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:14.991131 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:14.991353 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:14.991202 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret podName:b3a1c2eb-91b0-417e-818b-e08b94eca20e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:15.491183826 +0000 UTC m=+8.059685628 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret") pod "global-pull-secret-syncer-fbr4l" (UID: "b3a1c2eb-91b0-417e-818b-e08b94eca20e") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:14.991353 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.991285 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b3a1c2eb-91b0-417e-818b-e08b94eca20e-dbus\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:14.991353 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:14.991339 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b3a1c2eb-91b0-417e-818b-e08b94eca20e-kubelet-config\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:15.495130 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:15.495092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:15.495325 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:15.495259 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:15.495395 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:15.495330 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret podName:b3a1c2eb-91b0-417e-818b-e08b94eca20e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:16.495310424 +0000 UTC m=+9.063812209 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret") pod "global-pull-secret-syncer-fbr4l" (UID: "b3a1c2eb-91b0-417e-818b-e08b94eca20e") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:15.900911 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:15.900835 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:15.900911 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:15.900879 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:15.901314 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:15.900974 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2"
Apr 22 18:36:15.901407 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:15.901383 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e"
Apr 22 18:36:16.503417 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:16.503374 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:16.503589 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:16.503522 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:16.503589 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:16.503585 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret podName:b3a1c2eb-91b0-417e-818b-e08b94eca20e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:18.503564693 +0000 UTC m=+11.072066490 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret") pod "global-pull-secret-syncer-fbr4l" (UID: "b3a1c2eb-91b0-417e-818b-e08b94eca20e") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:16.901796 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:16.901725 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:16.902196 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:16.901833 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6"
Apr 22 18:36:17.513649 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:17.513603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:17.513807 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:17.513755 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:17.513866 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:17.513820 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs podName:f20c356d-ebd8-4177-92c7-8bf2571249a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:25.513800549 +0000 UTC m=+18.082302337 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs") pod "network-metrics-daemon-vqnz5" (UID: "f20c356d-ebd8-4177-92c7-8bf2571249a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:17.614423 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:17.614363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gjr\" (UniqueName: \"kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr\") pod \"network-check-target-p756f\" (UID: \"f632af1b-67e1-4b4d-9446-ea503297edd6\") " pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:17.614567 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:17.614529 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:36:17.614567 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:17.614554 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:36:17.614567 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:17.614569 2577 projected.go:194] Error preparing data for projected volume kube-api-access-b2gjr for pod openshift-network-diagnostics/network-check-target-p756f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:17.614679 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:17.614634 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr podName:f632af1b-67e1-4b4d-9446-ea503297edd6 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:25.614611094 +0000 UTC m=+18.183112902 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2gjr" (UniqueName: "kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr") pod "network-check-target-p756f" (UID: "f632af1b-67e1-4b4d-9446-ea503297edd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:36:17.902931 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:17.902349 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:17.902931 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:17.902367 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:17.902931 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:17.902471 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2"
Apr 22 18:36:17.902931 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:17.902556 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e"
Apr 22 18:36:18.521000 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:18.520962 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:18.521192 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:18.521098 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:18.521192 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:18.521157 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret podName:b3a1c2eb-91b0-417e-818b-e08b94eca20e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:22.521140406 +0000 UTC m=+15.089642195 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret") pod "global-pull-secret-syncer-fbr4l" (UID: "b3a1c2eb-91b0-417e-818b-e08b94eca20e") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:18.900843 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:18.900763 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:18.900999 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:18.900885 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6"
Apr 22 18:36:19.901468 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:19.901430 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:19.901947 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:19.901430 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:19.901947 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:19.901576 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2"
Apr 22 18:36:19.901947 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:19.901629 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e"
Apr 22 18:36:20.901711 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:20.901678 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:20.902166 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:20.901795 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6"
Apr 22 18:36:21.900849 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:21.900787 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:21.901046 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:21.900787 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:21.901046 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:21.900906 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e"
Apr 22 18:36:21.901046 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:21.901012 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2"
Apr 22 18:36:22.553019 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:22.552982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:22.553482 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:22.553146 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:22.553482 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:22.553209 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret podName:b3a1c2eb-91b0-417e-818b-e08b94eca20e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:30.553191811 +0000 UTC m=+23.121693613 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret") pod "global-pull-secret-syncer-fbr4l" (UID: "b3a1c2eb-91b0-417e-818b-e08b94eca20e") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:36:22.901490 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:22.901417 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:22.901627 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:22.901539 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6"
Apr 22 18:36:23.901493 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:23.901458 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:23.901493 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:23.901487 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:23.901951 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:23.901586 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2"
Apr 22 18:36:23.901951 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:23.901749 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e"
Apr 22 18:36:24.901038 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:24.901002 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f"
Apr 22 18:36:24.901370 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:24.901104 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6"
Apr 22 18:36:25.578427 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:25.578388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:25.578874 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:25.578510 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:36:25.578874 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:25.578577 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs podName:f20c356d-ebd8-4177-92c7-8bf2571249a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:41.578558225 +0000 UTC m=+34.147060012 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs") pod "network-metrics-daemon-vqnz5" (UID: "f20c356d-ebd8-4177-92c7-8bf2571249a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:25.678993 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:25.678959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gjr\" (UniqueName: \"kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr\") pod \"network-check-target-p756f\" (UID: \"f632af1b-67e1-4b4d-9446-ea503297edd6\") " pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:25.679149 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:25.679119 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:25.679149 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:25.679141 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:25.679249 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:25.679155 2577 projected.go:194] Error preparing data for projected volume kube-api-access-b2gjr for pod openshift-network-diagnostics/network-check-target-p756f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:25.679249 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:25.679218 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr podName:f632af1b-67e1-4b4d-9446-ea503297edd6 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:41.679198156 +0000 UTC m=+34.247699946 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-b2gjr" (UniqueName: "kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr") pod "network-check-target-p756f" (UID: "f632af1b-67e1-4b4d-9446-ea503297edd6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:25.901134 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:25.901065 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:25.901286 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:25.901189 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2" Apr 22 18:36:25.901286 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:25.901254 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:25.901384 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:25.901362 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e" Apr 22 18:36:26.901169 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:26.901136 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:26.901563 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:26.901289 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:27.902500 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.902089 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:27.902935 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.902142 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:27.902935 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:27.902603 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e" Apr 22 18:36:27.902935 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:27.902645 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2" Apr 22 18:36:27.961820 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.961786 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b7lx2" event={"ID":"ff2fec6e-71e1-40b5-a159-2609e7db8ff5","Type":"ContainerStarted","Data":"100294ede824afc2dc3ff0f115773d9478d646b6c29aa3c07f4e73d2f7011dd8"} Apr 22 18:36:27.962868 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.962841 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pjhsl" event={"ID":"d6ae7655-3652-4c40-a767-81b41b44d742","Type":"ContainerStarted","Data":"8bc8f9b15c4e280a8abe7e7bf4007d6cf5a9c7c6d592e9410f6b52a556928927"} Apr 22 18:36:27.967979 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.967958 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wm9hb" event={"ID":"938b317c-4e75-4cee-9219-836d71fde87b","Type":"ContainerStarted","Data":"5bd9a6003dd03d14234656dcecb9525d2905cb8cf15e2b7cea39777f23b4c62a"} Apr 22 18:36:27.969428 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.969398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" event={"ID":"447d848d-6ef3-4b39-a91c-16579bc83c6d","Type":"ContainerStarted","Data":"d0824b8632ab35516b574b1b75449f6c87e2499410ab2728ed524f2269fd011e"} Apr 22 18:36:27.970844 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.970820 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" event={"ID":"63465ecd-302d-4a7f-b3d3-9b9cc341d995","Type":"ContainerStarted","Data":"2fd8601f799baa0d3a3f9170551ea194097ff4f83d4a2a64581cbd96546aeb62"} Apr 22 18:36:27.973503 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.973484 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" event={"ID":"946fd2db-0b92-4961-b670-a53e33d7f40f","Type":"ContainerStarted","Data":"10e8d5abcbd0e7854adbd89292bb2d1cfa0a7ab795a9084d95091c0d3b8d7ca7"} Apr 22 18:36:27.973503 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.973507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" event={"ID":"946fd2db-0b92-4961-b670-a53e33d7f40f","Type":"ContainerStarted","Data":"a3d422ed95a769edb9b6d0c19a71a20c8c566f7608dde3e4e430b99004225d34"} Apr 22 18:36:27.973716 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.973517 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" event={"ID":"946fd2db-0b92-4961-b670-a53e33d7f40f","Type":"ContainerStarted","Data":"1796c8355ebd3467c28f73343f0456d0e308a5b5dc47e56b31fd8d14af24767a"} Apr 22 18:36:27.973716 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.973529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" event={"ID":"946fd2db-0b92-4961-b670-a53e33d7f40f","Type":"ContainerStarted","Data":"15d152f576c75cd7cc5d1de75c61920f96196a6cb54a96a21b533264b8f6401e"} Apr 22 18:36:27.974691 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.974669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" event={"ID":"85884605-a0fa-4cd6-b0b8-07548e94e655","Type":"ContainerStarted","Data":"a5d52f68fc2178374457af243fd212bfebca86d40eeb960b2981b80ed0a95c4c"} Apr 22 18:36:27.981154 
ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.981117 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b7lx2" podStartSLOduration=3.085280981 podStartE2EDuration="19.981105748s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:36:10.613739033 +0000 UTC m=+3.182240822" lastFinishedPulling="2026-04-22 18:36:27.509563804 +0000 UTC m=+20.078065589" observedRunningTime="2026-04-22 18:36:27.980699694 +0000 UTC m=+20.549201502" watchObservedRunningTime="2026-04-22 18:36:27.981105748 +0000 UTC m=+20.549607555" Apr 22 18:36:27.995968 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:27.995932 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wm9hb" podStartSLOduration=7.908702975 podStartE2EDuration="19.99592005s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:36:10.620264922 +0000 UTC m=+3.188766709" lastFinishedPulling="2026-04-22 18:36:22.707481995 +0000 UTC m=+15.275983784" observedRunningTime="2026-04-22 18:36:27.99575034 +0000 UTC m=+20.564252147" watchObservedRunningTime="2026-04-22 18:36:27.99592005 +0000 UTC m=+20.564421856" Apr 22 18:36:28.010830 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.010767 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pjhsl" podStartSLOduration=3.5272837409999998 podStartE2EDuration="20.010752839s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:36:10.610721064 +0000 UTC m=+3.179222849" lastFinishedPulling="2026-04-22 18:36:27.094190162 +0000 UTC m=+19.662691947" observedRunningTime="2026-04-22 18:36:28.010185295 +0000 UTC m=+20.578687093" watchObservedRunningTime="2026-04-22 18:36:28.010752839 +0000 UTC m=+20.579254644" Apr 22 18:36:28.639781 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.639760 2577 plugin_watcher.go:194] "Adding 
socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:36:28.900714 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.900688 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:28.900849 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:28.900803 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:28.911300 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.911207 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:36:28.639776935Z","UUID":"8f5e5f6d-4b04-4cba-9b99-e0fa902e6f74","Handler":null,"Name":"","Endpoint":""} Apr 22 18:36:28.914399 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.914382 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:36:28.914496 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.914404 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:36:28.978561 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.978535 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" 
event={"ID":"85884605-a0fa-4cd6-b0b8-07548e94e655","Type":"ContainerStarted","Data":"9db88f28cc9cf7b4669a51520e1adbcfddb0e53aca6c7348d7b99fc26338ddfd"} Apr 22 18:36:28.979777 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.979760 2577 generic.go:358] "Generic (PLEG): container finished" podID="447d848d-6ef3-4b39-a91c-16579bc83c6d" containerID="d0824b8632ab35516b574b1b75449f6c87e2499410ab2728ed524f2269fd011e" exitCode=0 Apr 22 18:36:28.979847 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.979813 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" event={"ID":"447d848d-6ef3-4b39-a91c-16579bc83c6d","Type":"ContainerDied","Data":"d0824b8632ab35516b574b1b75449f6c87e2499410ab2728ed524f2269fd011e"} Apr 22 18:36:28.981053 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.981029 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-422g7" event={"ID":"4f81323a-e699-4fb2-8971-b7fb4c18c2b3","Type":"ContainerStarted","Data":"33d932827ab9e9be16076e243038ec4603d5281f4c912ecd826c825bde7aba61"} Apr 22 18:36:28.983456 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.983432 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" event={"ID":"946fd2db-0b92-4961-b670-a53e33d7f40f","Type":"ContainerStarted","Data":"2134333f2802f4bdbd1cd01d9e58f7863e7cdef0b8a6d67f1fac9b72db238af2"} Apr 22 18:36:28.983540 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.983458 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" event={"ID":"946fd2db-0b92-4961-b670-a53e33d7f40f","Type":"ContainerStarted","Data":"49582236558b8ed7b2b879492e0cb87d0925b93d4d891fed48157cab4492fd76"} Apr 22 18:36:28.999568 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:28.999534 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-6vvzm" podStartSLOduration=4.116021612 podStartE2EDuration="20.999523348s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:36:10.61693379 +0000 UTC m=+3.185435576" lastFinishedPulling="2026-04-22 18:36:27.500435527 +0000 UTC m=+20.068937312" observedRunningTime="2026-04-22 18:36:28.060675004 +0000 UTC m=+20.629176812" watchObservedRunningTime="2026-04-22 18:36:28.999523348 +0000 UTC m=+21.568025179" Apr 22 18:36:29.901407 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:29.901190 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:29.901603 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:29.901219 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:29.901603 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:29.901512 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e" Apr 22 18:36:29.901699 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:29.901639 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2" Apr 22 18:36:29.987064 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:29.987020 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" event={"ID":"85884605-a0fa-4cd6-b0b8-07548e94e655","Type":"ContainerStarted","Data":"0fbdc93822e70fc74ec736998f353bcc2c1f270e24e98bb446cb7010e901aef6"} Apr 22 18:36:30.003728 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:30.003688 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-422g7" podStartSLOduration=5.122725359 podStartE2EDuration="22.003675197s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:36:10.617923487 +0000 UTC m=+3.186425276" lastFinishedPulling="2026-04-22 18:36:27.498873316 +0000 UTC m=+20.067375114" observedRunningTime="2026-04-22 18:36:29.015937359 +0000 UTC m=+21.584439163" watchObservedRunningTime="2026-04-22 18:36:30.003675197 +0000 UTC m=+22.572177006" Apr 22 18:36:30.003875 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:30.003847 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k86t7" podStartSLOduration=3.16675403 podStartE2EDuration="22.003839159s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:36:10.620811148 +0000 UTC m=+3.189312935" lastFinishedPulling="2026-04-22 18:36:29.457896277 +0000 UTC m=+22.026398064" observedRunningTime="2026-04-22 18:36:30.003132278 +0000 UTC m=+22.571634097" watchObservedRunningTime="2026-04-22 18:36:30.003839159 +0000 UTC m=+22.572340966" Apr 22 18:36:30.615939 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:30.615900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:30.616098 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:30.616028 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:30.616098 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:30.616088 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret podName:b3a1c2eb-91b0-417e-818b-e08b94eca20e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:46.616069558 +0000 UTC m=+39.184571360 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret") pod "global-pull-secret-syncer-fbr4l" (UID: "b3a1c2eb-91b0-417e-818b-e08b94eca20e") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:30.901661 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:30.901580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:30.901816 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:30.901683 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:30.992782 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:30.992735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" event={"ID":"946fd2db-0b92-4961-b670-a53e33d7f40f","Type":"ContainerStarted","Data":"1a3dcff8b2f23475a1b18b664e3fe5a0a14d5522060a98336c72e89c8b83d240"} Apr 22 18:36:31.901042 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:31.901008 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:31.901224 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:31.901008 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:31.901224 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:31.901146 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2" Apr 22 18:36:31.901224 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:31.901212 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e" Apr 22 18:36:32.702900 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:32.702868 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:32.703673 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:32.703656 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:32.901380 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:32.901352 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:32.901549 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:32.901521 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:33.901762 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:33.901578 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:33.902264 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:33.901578 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:33.902264 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:33.901839 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2" Apr 22 18:36:33.902264 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:33.901902 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e" Apr 22 18:36:33.999075 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:33.999053 2577 generic.go:358] "Generic (PLEG): container finished" podID="447d848d-6ef3-4b39-a91c-16579bc83c6d" containerID="3fe5a25f152fd276ce67956192481657ed51007942f3a73c39bd00b7725f81e0" exitCode=0 Apr 22 18:36:33.999204 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:33.999119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" event={"ID":"447d848d-6ef3-4b39-a91c-16579bc83c6d","Type":"ContainerDied","Data":"3fe5a25f152fd276ce67956192481657ed51007942f3a73c39bd00b7725f81e0"} Apr 22 18:36:34.002182 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:34.002164 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" event={"ID":"946fd2db-0b92-4961-b670-a53e33d7f40f","Type":"ContainerStarted","Data":"718eac2b675f82e41768279eccce8e24ba7257bb72a3f2c90b827670f6e65d69"} Apr 22 18:36:34.002475 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:34.002458 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:34.016543 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:34.016518 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:34.050519 ip-10-0-133-29 
kubenswrapper[2577]: I0422 18:36:34.050483 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" podStartSLOduration=8.994182086 podStartE2EDuration="26.05047154s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:36:10.615465208 +0000 UTC m=+3.183967006" lastFinishedPulling="2026-04-22 18:36:27.671754663 +0000 UTC m=+20.240256460" observedRunningTime="2026-04-22 18:36:34.049761913 +0000 UTC m=+26.618263719" watchObservedRunningTime="2026-04-22 18:36:34.05047154 +0000 UTC m=+26.618973347" Apr 22 18:36:34.900993 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:34.900931 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:34.901143 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:34.901021 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:35.005028 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.005005 2577 generic.go:358] "Generic (PLEG): container finished" podID="447d848d-6ef3-4b39-a91c-16579bc83c6d" containerID="d2497243a6dd2355471d0108f33612dc8ead011f78868dd1da7f19a41e171270" exitCode=0 Apr 22 18:36:35.005390 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.005097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" event={"ID":"447d848d-6ef3-4b39-a91c-16579bc83c6d","Type":"ContainerDied","Data":"d2497243a6dd2355471d0108f33612dc8ead011f78868dd1da7f19a41e171270"} Apr 22 18:36:35.005390 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.005144 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:36:35.005578 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.005541 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:35.018952 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.018931 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:35.045115 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.045069 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fbr4l"] Apr 22 18:36:35.045219 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.045151 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:35.045303 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:35.045273 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e" Apr 22 18:36:35.055784 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.055753 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p756f"] Apr 22 18:36:35.055898 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.055826 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vqnz5"] Apr 22 18:36:35.055898 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.055879 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:35.056061 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:35.056032 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:35.056115 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.056086 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:35.056200 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:35.056185 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2" Apr 22 18:36:35.109622 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:35.109597 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:36:36.008783 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:36.008722 2577 generic.go:358] "Generic (PLEG): container finished" podID="447d848d-6ef3-4b39-a91c-16579bc83c6d" containerID="c2ac4f98d3870c531d47062940f3b9a89c0263745c0227fdd0213424b48ee2d8" exitCode=0 Apr 22 18:36:36.009086 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:36.008807 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" event={"ID":"447d848d-6ef3-4b39-a91c-16579bc83c6d","Type":"ContainerDied","Data":"c2ac4f98d3870c531d47062940f3b9a89c0263745c0227fdd0213424b48ee2d8"} Apr 22 18:36:36.900913 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:36.900841 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:36.901109 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:36.900841 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:36.901109 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:36.900964 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2" Apr 22 18:36:36.901109 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:36.901016 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:36.901288 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:36.901109 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:36.901288 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:36.901224 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e" Apr 22 18:36:38.563922 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:38.563890 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:38.564771 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:38.564043 2577 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:36:38.564771 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:38.564483 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pjhsl" Apr 22 18:36:38.901769 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:38.901646 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:38.901769 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:38.901646 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:38.901769 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:38.901713 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:38.902019 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:38.901827 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p756f" podUID="f632af1b-67e1-4b4d-9446-ea503297edd6" Apr 22 18:36:38.902019 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:38.901911 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vqnz5" podUID="f20c356d-ebd8-4177-92c7-8bf2571249a2" Apr 22 18:36:38.902019 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:38.901990 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fbr4l" podUID="b3a1c2eb-91b0-417e-818b-e08b94eca20e" Apr 22 18:36:39.243269 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.243224 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-29.ec2.internal" event="NodeReady" Apr 22 18:36:39.243402 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.243390 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:36:39.284332 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.284309 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr"] Apr 22 18:36:39.314351 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.314078 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s"] Apr 22 18:36:39.314645 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.314622 2577 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.317061 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.316919 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 18:36:39.317061 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.316980 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 18:36:39.317061 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.317018 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 18:36:39.317372 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.317145 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 18:36:39.317372 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.316930 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 18:36:39.317372 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.317265 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 18:36:39.317372 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.317365 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 18:36:39.330879 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.330841 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/image-registry-56667fd489-p9kvx"] Apr 22 18:36:39.331010 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.330993 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" Apr 22 18:36:39.333471 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.333451 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 18:36:39.333821 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.333793 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-fsz25\"" Apr 22 18:36:39.348667 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.348642 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855"] Apr 22 18:36:39.348809 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.348789 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.351017 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.350997 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:36:39.351113 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.351045 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:36:39.351113 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.351064 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:36:39.351113 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.351077 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8gvfs\"" Apr 22 18:36:39.356460 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.356442 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:36:39.363876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.363791 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr"] Apr 22 18:36:39.363876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.363817 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s"] Apr 22 18:36:39.363876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.363829 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855"] Apr 22 18:36:39.363876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.363839 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-56667fd489-p9kvx"] Apr 22 18:36:39.363876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.363853 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vtrxb"] Apr 22 18:36:39.364172 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.364001 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.367992 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.367973 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 18:36:39.378435 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.378414 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vtrxb"] Apr 22 18:36:39.378539 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.378526 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:36:39.380736 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.380604 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:36:39.380736 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.380635 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:36:39.380736 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.380672 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x8r7b\"" Apr 22 18:36:39.380736 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.380608 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:36:39.403453 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.403423 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vc87s"] Apr 22 18:36:39.418933 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.418911 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.421699 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.421679 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:36:39.422085 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.422039 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:36:39.422271 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.422220 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:36:39.422406 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.422376 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:36:39.422589 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.422575 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xfhxc\"" Apr 22 18:36:39.422674 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.422624 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vc87s"] Apr 22 18:36:39.490115 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-registry-certificates\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.490299 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxp7\" (UniqueName: 
\"kubernetes.io/projected/6cb93b16-d01c-455e-9091-30190e9bd5b0-kube-api-access-gdxp7\") pod \"managed-serviceaccount-addon-agent-754879b4d5-4fp2s\" (UID: \"6cb93b16-d01c-455e-9091-30190e9bd5b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" Apr 22 18:36:39.490299 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-installation-pull-secrets\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.490299 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490195 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-ca\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.490299 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490246 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-hub\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.490299 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-bound-sa-token\") pod \"image-registry-56667fd489-p9kvx\" 
(UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.490299 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490290 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw6l5\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-kube-api-access-zw6l5\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.490514 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490305 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8n9f\" (UniqueName: \"kubernetes.io/projected/8a55aad3-c17e-4873-8971-af9571fa6014-kube-api-access-v8n9f\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.490514 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3af1833a-5bf4-4b71-9f7b-9ca741fcdc32-tmp\") pod \"klusterlet-addon-workmgr-5c4556f8d4-qd855\" (UID: \"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.490514 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8a55aad3-c17e-4873-8971-af9571fa6014-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.490514 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmk62\" (UniqueName: \"kubernetes.io/projected/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-kube-api-access-jmk62\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:36:39.490514 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6cb93b16-d01c-455e-9091-30190e9bd5b0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-754879b4d5-4fp2s\" (UID: \"6cb93b16-d01c-455e-9091-30190e9bd5b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" Apr 22 18:36:39.490514 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:36:39.490748 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d54dx\" (UniqueName: \"kubernetes.io/projected/3af1833a-5bf4-4b71-9f7b-9ca741fcdc32-kube-api-access-d54dx\") pod \"klusterlet-addon-workmgr-5c4556f8d4-qd855\" (UID: \"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.490748 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490546 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.490748 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.490748 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.490748 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/171e80c5-873b-40fc-b152-1a96147b240e-ca-trust-extracted\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.490748 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490640 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" 
(UniqueName: \"kubernetes.io/secret/3af1833a-5bf4-4b71-9f7b-9ca741fcdc32-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c4556f8d4-qd855\" (UID: \"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.490748 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-trusted-ca\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.490748 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.490680 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-image-registry-private-configuration\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.591487 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-ca\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.591487 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-hub\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-bound-sa-token\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw6l5\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-kube-api-access-zw6l5\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8n9f\" (UniqueName: \"kubernetes.io/projected/8a55aad3-c17e-4873-8971-af9571fa6014-kube-api-access-v8n9f\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3af1833a-5bf4-4b71-9f7b-9ca741fcdc32-tmp\") pod \"klusterlet-addon-workmgr-5c4556f8d4-qd855\" (UID: \"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591615 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8a55aad3-c17e-4873-8971-af9571fa6014-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmk62\" (UniqueName: \"kubernetes.io/projected/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-kube-api-access-jmk62\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6cb93b16-d01c-455e-9091-30190e9bd5b0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-754879b4d5-4fp2s\" (UID: \"6cb93b16-d01c-455e-9091-30190e9bd5b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591753 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d54dx\" (UniqueName: \"kubernetes.io/projected/3af1833a-5bf4-4b71-9f7b-9ca741fcdc32-kube-api-access-d54dx\") pod \"klusterlet-addon-workmgr-5c4556f8d4-qd855\" (UID: 
\"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591833 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25870685-4492-459c-a0e2-82a05d74127a-tmp-dir\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591862 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591889 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c84qw\" (UniqueName: \"kubernetes.io/projected/25870685-4492-459c-a0e2-82a05d74127a-kube-api-access-c84qw\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.592107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591955 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/171e80c5-873b-40fc-b152-1a96147b240e-ca-trust-extracted\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.591980 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3af1833a-5bf4-4b71-9f7b-9ca741fcdc32-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c4556f8d4-qd855\" (UID: \"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.592008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-trusted-ca\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " 
pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:39.592090 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:39.592175 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert podName:84a0c8c1-7bbb-41ec-86d3-1c379b8789ff nodeName:}" failed. No retries permitted until 2026-04-22 18:36:40.092154367 +0000 UTC m=+32.660656166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert") pod "ingress-canary-vtrxb" (UID: "84a0c8c1-7bbb-41ec-86d3-1c379b8789ff") : secret "canary-serving-cert" not found Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.592523 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8a55aad3-c17e-4873-8971-af9571fa6014-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:39.592607 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:39.592619 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56667fd489-p9kvx: secret "image-registry-tls" not found Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:39.592661 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls podName:171e80c5-873b-40fc-b152-1a96147b240e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:40.092644219 +0000 UTC m=+32.661146004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls") pod "image-registry-56667fd489-p9kvx" (UID: "171e80c5-873b-40fc-b152-1a96147b240e") : secret "image-registry-tls" not found Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.592682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25870685-4492-459c-a0e2-82a05d74127a-config-volume\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.592721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-image-registry-private-configuration\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.592758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-registry-certificates\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.592787 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gdxp7\" (UniqueName: \"kubernetes.io/projected/6cb93b16-d01c-455e-9091-30190e9bd5b0-kube-api-access-gdxp7\") pod \"managed-serviceaccount-addon-agent-754879b4d5-4fp2s\" (UID: \"6cb93b16-d01c-455e-9091-30190e9bd5b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" Apr 22 18:36:39.592829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.592811 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-installation-pull-secrets\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.593462 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.592978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-trusted-ca\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.593462 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.593221 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/171e80c5-873b-40fc-b152-1a96147b240e-ca-trust-extracted\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.593948 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.593924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-registry-certificates\") pod \"image-registry-56667fd489-p9kvx\" (UID: 
\"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.596971 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.596925 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3af1833a-5bf4-4b71-9f7b-9ca741fcdc32-tmp\") pod \"klusterlet-addon-workmgr-5c4556f8d4-qd855\" (UID: \"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.597089 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.597041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-ca\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.597302 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.597199 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.597592 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.597574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3af1833a-5bf4-4b71-9f7b-9ca741fcdc32-klusterlet-config\") pod \"klusterlet-addon-workmgr-5c4556f8d4-qd855\" (UID: \"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.597681 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.597630 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-installation-pull-secrets\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.597803 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.597765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-image-registry-private-configuration\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.597893 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.597870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6cb93b16-d01c-455e-9091-30190e9bd5b0-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-754879b4d5-4fp2s\" (UID: \"6cb93b16-d01c-455e-9091-30190e9bd5b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" Apr 22 18:36:39.598361 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.598338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.598514 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.598491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8a55aad3-c17e-4873-8971-af9571fa6014-hub\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" 
(UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.604086 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.604056 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-bound-sa-token\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.604761 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.604638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw6l5\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-kube-api-access-zw6l5\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:39.604862 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.604781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmk62\" (UniqueName: \"kubernetes.io/projected/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-kube-api-access-jmk62\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:36:39.604862 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.604843 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8n9f\" (UniqueName: \"kubernetes.io/projected/8a55aad3-c17e-4873-8971-af9571fa6014-kube-api-access-v8n9f\") pod \"cluster-proxy-proxy-agent-85c66c68c9-p2xdr\" (UID: \"8a55aad3-c17e-4873-8971-af9571fa6014\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.605992 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.605967 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d54dx\" (UniqueName: \"kubernetes.io/projected/3af1833a-5bf4-4b71-9f7b-9ca741fcdc32-kube-api-access-d54dx\") pod \"klusterlet-addon-workmgr-5c4556f8d4-qd855\" (UID: \"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.606094 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.606027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxp7\" (UniqueName: \"kubernetes.io/projected/6cb93b16-d01c-455e-9091-30190e9bd5b0-kube-api-access-gdxp7\") pod \"managed-serviceaccount-addon-agent-754879b4d5-4fp2s\" (UID: \"6cb93b16-d01c-455e-9091-30190e9bd5b0\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" Apr 22 18:36:39.635960 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.635931 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:36:39.648043 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.648022 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" Apr 22 18:36:39.677916 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.677867 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:39.694109 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.694087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.694218 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.694126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25870685-4492-459c-a0e2-82a05d74127a-tmp-dir\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.694218 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.694180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c84qw\" (UniqueName: \"kubernetes.io/projected/25870685-4492-459c-a0e2-82a05d74127a-kube-api-access-c84qw\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.694351 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.694247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25870685-4492-459c-a0e2-82a05d74127a-config-volume\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.694351 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:39.694265 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:39.694351 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:39.694339 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls podName:25870685-4492-459c-a0e2-82a05d74127a nodeName:}" failed. No retries permitted until 2026-04-22 18:36:40.194317377 +0000 UTC m=+32.762819182 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls") pod "dns-default-vc87s" (UID: "25870685-4492-459c-a0e2-82a05d74127a") : secret "dns-default-metrics-tls" not found Apr 22 18:36:39.695754 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.695734 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25870685-4492-459c-a0e2-82a05d74127a-tmp-dir\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.695973 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.695944 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25870685-4492-459c-a0e2-82a05d74127a-config-volume\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.700582 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.700558 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-sdrsv"] Apr 22 18:36:39.709911 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.709883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c84qw\" (UniqueName: \"kubernetes.io/projected/25870685-4492-459c-a0e2-82a05d74127a-kube-api-access-c84qw\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:39.715472 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.715452 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:39.718078 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.718043 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-44zj9\"" Apr 22 18:36:39.895760 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.895724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71861e02-68a2-47be-ad90-6fd2d00d058b-tmp-dir\") pod \"node-resolver-sdrsv\" (UID: \"71861e02-68a2-47be-ad90-6fd2d00d058b\") " pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:39.895932 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.895807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c28w\" (UniqueName: \"kubernetes.io/projected/71861e02-68a2-47be-ad90-6fd2d00d058b-kube-api-access-4c28w\") pod \"node-resolver-sdrsv\" (UID: \"71861e02-68a2-47be-ad90-6fd2d00d058b\") " pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:39.895932 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.895842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71861e02-68a2-47be-ad90-6fd2d00d058b-hosts-file\") pod \"node-resolver-sdrsv\" (UID: \"71861e02-68a2-47be-ad90-6fd2d00d058b\") " pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:39.997046 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.996972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71861e02-68a2-47be-ad90-6fd2d00d058b-tmp-dir\") pod \"node-resolver-sdrsv\" (UID: \"71861e02-68a2-47be-ad90-6fd2d00d058b\") " pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:39.997046 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.997029 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4c28w\" (UniqueName: \"kubernetes.io/projected/71861e02-68a2-47be-ad90-6fd2d00d058b-kube-api-access-4c28w\") pod \"node-resolver-sdrsv\" (UID: \"71861e02-68a2-47be-ad90-6fd2d00d058b\") " pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:39.997217 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.997055 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71861e02-68a2-47be-ad90-6fd2d00d058b-hosts-file\") pod \"node-resolver-sdrsv\" (UID: \"71861e02-68a2-47be-ad90-6fd2d00d058b\") " pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:39.997217 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.997177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71861e02-68a2-47be-ad90-6fd2d00d058b-hosts-file\") pod \"node-resolver-sdrsv\" (UID: \"71861e02-68a2-47be-ad90-6fd2d00d058b\") " pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:39.997382 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:39.997334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71861e02-68a2-47be-ad90-6fd2d00d058b-tmp-dir\") pod \"node-resolver-sdrsv\" (UID: \"71861e02-68a2-47be-ad90-6fd2d00d058b\") " pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:40.009530 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.009500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c28w\" (UniqueName: \"kubernetes.io/projected/71861e02-68a2-47be-ad90-6fd2d00d058b-kube-api-access-4c28w\") pod \"node-resolver-sdrsv\" (UID: \"71861e02-68a2-47be-ad90-6fd2d00d058b\") " pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:40.026318 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.026293 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-sdrsv" Apr 22 18:36:40.098327 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.098297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:36:40.098512 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.098363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:40.098512 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:40.098454 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:40.098512 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:40.098478 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56667fd489-p9kvx: secret "image-registry-tls" not found Apr 22 18:36:40.098667 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:40.098535 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls podName:171e80c5-873b-40fc-b152-1a96147b240e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:41.098517687 +0000 UTC m=+33.667019495 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls") pod "image-registry-56667fd489-p9kvx" (UID: "171e80c5-873b-40fc-b152-1a96147b240e") : secret "image-registry-tls" not found Apr 22 18:36:40.098667 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:40.098453 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:40.098667 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:40.098571 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert podName:84a0c8c1-7bbb-41ec-86d3-1c379b8789ff nodeName:}" failed. No retries permitted until 2026-04-22 18:36:41.098559294 +0000 UTC m=+33.667061079 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert") pod "ingress-canary-vtrxb" (UID: "84a0c8c1-7bbb-41ec-86d3-1c379b8789ff") : secret "canary-serving-cert" not found Apr 22 18:36:40.199281 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.199220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:40.199435 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:40.199350 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:40.199435 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:40.199412 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls podName:25870685-4492-459c-a0e2-82a05d74127a nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:41.199396236 +0000 UTC m=+33.767898021 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls") pod "dns-default-vc87s" (UID: "25870685-4492-459c-a0e2-82a05d74127a") : secret "dns-default-metrics-tls" not found Apr 22 18:36:40.901443 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.901412 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:36:40.902143 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.901411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:40.902143 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.901412 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l" Apr 22 18:36:40.905326 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.905043 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-95dwn\"" Apr 22 18:36:40.905326 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.905066 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:36:40.905326 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.905220 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:36:40.905592 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.905347 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:36:40.905592 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.905415 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zwsxn\""
Apr 22 18:36:40.905592 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:40.905470 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:36:41.106209 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.106175 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx"
Apr 22 18:36:41.106399 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.106289 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb"
Apr 22 18:36:41.106399 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.106306 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:36:41.106399 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.106327 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56667fd489-p9kvx: secret "image-registry-tls" not found
Apr 22 18:36:41.106399 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.106387 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls podName:171e80c5-873b-40fc-b152-1a96147b240e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:43.106372213 +0000 UTC m=+35.674874000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls") pod "image-registry-56667fd489-p9kvx" (UID: "171e80c5-873b-40fc-b152-1a96147b240e") : secret "image-registry-tls" not found
Apr 22 18:36:41.106563 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.106409 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:36:41.106563 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.106466 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert podName:84a0c8c1-7bbb-41ec-86d3-1c379b8789ff nodeName:}" failed. No retries permitted until 2026-04-22 18:36:43.106451388 +0000 UTC m=+35.674953173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert") pod "ingress-canary-vtrxb" (UID: "84a0c8c1-7bbb-41ec-86d3-1c379b8789ff") : secret "canary-serving-cert" not found
Apr 22 18:36:41.207630 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.207552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s"
Apr 22 18:36:41.207797 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.207697 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:36:41.207797 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.207762 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls podName:25870685-4492-459c-a0e2-82a05d74127a nodeName:}" failed. No retries permitted until 2026-04-22 18:36:43.207741037 +0000 UTC m=+35.776242822 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls") pod "dns-default-vc87s" (UID: "25870685-4492-459c-a0e2-82a05d74127a") : secret "dns-default-metrics-tls" not found
Apr 22 18:36:41.297871 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.297839 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w"]
Apr 22 18:36:41.320660 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.320444 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vllgk"]
Apr 22 18:36:41.320660 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.320611 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w"
Apr 22 18:36:41.325071 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.325048 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-9sb27\""
Apr 22 18:36:41.325196 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.325164 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:41.325317 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.325301 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:41.345482 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.345312 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w"]
Apr 22 18:36:41.345569 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.345498 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vllgk"]
Apr 22 18:36:41.345569 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.345448 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.348523 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.348506 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-tsppm\""
Apr 22 18:36:41.348786 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.348768 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 18:36:41.348850 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.348823 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:41.348954 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.348937 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 18:36:41.351702 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.351683 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:41.354490 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.354471 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 18:36:41.399087 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.399049 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"]
Apr 22 18:36:41.407697 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.407675 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk"]
Apr 22 18:36:41.407876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.407858 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"
Apr 22 18:36:41.409524 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.409502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49lg2\" (UniqueName: \"kubernetes.io/projected/17abeeb9-50cd-4f77-9ee1-e387268f4e5f-kube-api-access-49lg2\") pod \"volume-data-source-validator-7c6cbb6c87-rmv4w\" (UID: \"17abeeb9-50cd-4f77-9ee1-e387268f4e5f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w"
Apr 22 18:36:41.410329 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.410304 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-4zsj9\""
Apr 22 18:36:41.410428 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.410378 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 18:36:41.410528 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.410512 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:41.410594 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.410517 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:41.415159 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.415136 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"]
Apr 22 18:36:41.415344 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.415327 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk"
Apr 22 18:36:41.417359 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.417339 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-x748p\""
Apr 22 18:36:41.423782 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.423610 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dxw7s"]
Apr 22 18:36:41.423782 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.423765 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"
Apr 22 18:36:41.425919 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.425896 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:41.426027 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.425972 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 22 18:36:41.426027 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.425903 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 22 18:36:41.426292 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.426270 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-t99x9\""
Apr 22 18:36:41.426387 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.426343 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:41.433343 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.433321 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7786869f5d-r5qgc"]
Apr 22 18:36:41.433848 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.433481 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dxw7s"
Apr 22 18:36:41.436955 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.436936 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 22 18:36:41.437250 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.437062 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-g8hcq\""
Apr 22 18:36:41.438152 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.437805 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:36:41.438152 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.437856 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:36:41.438152 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.438120 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 18:36:41.442322 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.442300 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 18:36:41.444372 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.444348 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk"]
Apr 22 18:36:41.444488 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.444385 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"]
Apr 22 18:36:41.444488 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.444398 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dxw7s"]
Apr 22 18:36:41.444592 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.444519 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7786869f5d-r5qgc"]
Apr 22 18:36:41.444592 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.444543 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:41.444671 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.444548 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"]
Apr 22 18:36:41.446318 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.446297 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 18:36:41.446623 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.446602 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 18:36:41.446963 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.446652 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 18:36:41.446963 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.446728 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-ds9w2\""
Apr 22 18:36:41.446963 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.446603 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 18:36:41.446963 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.446823 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 18:36:41.447154 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.447129 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 18:36:41.510175 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510076 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-serving-cert\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.510352 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49lg2\" (UniqueName: \"kubernetes.io/projected/17abeeb9-50cd-4f77-9ee1-e387268f4e5f-kube-api-access-49lg2\") pod \"volume-data-source-validator-7c6cbb6c87-rmv4w\" (UID: \"17abeeb9-50cd-4f77-9ee1-e387268f4e5f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w"
Apr 22 18:36:41.510352 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwsqp\" (UniqueName: \"kubernetes.io/projected/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-kube-api-access-gwsqp\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.510489 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-config\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.510769 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"
Apr 22 18:36:41.510899 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77f0cc31-e890-43ad-a890-116520afc036-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-tkzvq\" (UID: \"77f0cc31-e890-43ad-a890-116520afc036\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"
Apr 22 18:36:41.510899 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llmv9\" (UniqueName: \"kubernetes.io/projected/b75815e5-b1cc-44af-8c53-32f3fc0feaec-kube-api-access-llmv9\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"
Apr 22 18:36:41.510899 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbjgt\" (UniqueName: \"kubernetes.io/projected/77f0cc31-e890-43ad-a890-116520afc036-kube-api-access-hbjgt\") pod \"kube-storage-version-migrator-operator-6769c5d45-tkzvq\" (UID: \"77f0cc31-e890-43ad-a890-116520afc036\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"
Apr 22 18:36:41.511056 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510910 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-trusted-ca\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.511056 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f0cc31-e890-43ad-a890-116520afc036-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-tkzvq\" (UID: \"77f0cc31-e890-43ad-a890-116520afc036\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"
Apr 22 18:36:41.511056 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.510981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjvdv\" (UniqueName: \"kubernetes.io/projected/40336ed8-b40b-4340-9566-2d80600db3d6-kube-api-access-kjvdv\") pod \"network-check-source-8894fc9bd-wpxbk\" (UID: \"40336ed8-b40b-4340-9566-2d80600db3d6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk"
Apr 22 18:36:41.511409 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.511386 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"]
Apr 22 18:36:41.522730 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.522706 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq"]
Apr 22 18:36:41.522879 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.522864 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"
Apr 22 18:36:41.526325 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.526304 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 18:36:41.527531 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.527512 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-wdqjw\""
Apr 22 18:36:41.528064 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.528047 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:36:41.530687 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.530607 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:36:41.537709 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.536188 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq"
Apr 22 18:36:41.537709 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.537267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49lg2\" (UniqueName: \"kubernetes.io/projected/17abeeb9-50cd-4f77-9ee1-e387268f4e5f-kube-api-access-49lg2\") pod \"volume-data-source-validator-7c6cbb6c87-rmv4w\" (UID: \"17abeeb9-50cd-4f77-9ee1-e387268f4e5f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w"
Apr 22 18:36:41.537979 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.537871 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq"]
Apr 22 18:36:41.547371 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.547177 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 18:36:41.548622 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.548601 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"]
Apr 22 18:36:41.550043 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.550024 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:36:41.560208 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.560114 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 18:36:41.560554 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.560518 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 18:36:41.560554 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.560542 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 18:36:41.560768 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.560747 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-2bpbh\""
Apr 22 18:36:41.612153 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nctfj\" (UniqueName: \"kubernetes.io/projected/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-kube-api-access-nctfj\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s"
Apr 22 18:36:41.612318 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"
Apr 22 18:36:41.612318 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-default-certificate\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:41.612318 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:41.612318 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612292 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-snapshots\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s"
Apr 22 18:36:41.612318 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.612306 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:36:41.612493 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.612384 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls podName:b75815e5-b1cc-44af-8c53-32f3fc0feaec nodeName:}" failed. No retries permitted until 2026-04-22 18:36:42.112367428 +0000 UTC m=+34.680869219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fljkh" (UID: "b75815e5-b1cc-44af-8c53-32f3fc0feaec") : secret "samples-operator-tls" not found
Apr 22 18:36:41.612493 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:41.612493 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612443 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsndg\" (UniqueName: \"kubernetes.io/projected/94003369-e42a-4e67-ab30-f6abbd37dcc7-kube-api-access-bsndg\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:41.612493 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77f0cc31-e890-43ad-a890-116520afc036-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-tkzvq\" (UID: \"77f0cc31-e890-43ad-a890-116520afc036\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"
Apr 22 18:36:41.612644 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llmv9\" (UniqueName: \"kubernetes.io/projected/b75815e5-b1cc-44af-8c53-32f3fc0feaec-kube-api-access-llmv9\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"
Apr 22 18:36:41.612644 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612528 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbjgt\" (UniqueName: \"kubernetes.io/projected/77f0cc31-e890-43ad-a890-116520afc036-kube-api-access-hbjgt\") pod \"kube-storage-version-migrator-operator-6769c5d45-tkzvq\" (UID: \"77f0cc31-e890-43ad-a890-116520afc036\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"
Apr 22 18:36:41.612644 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-trusted-ca\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.612644 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f0cc31-e890-43ad-a890-116520afc036-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-tkzvq\" (UID: \"77f0cc31-e890-43ad-a890-116520afc036\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"
Apr 22 18:36:41.612644 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjvdv\" (UniqueName: \"kubernetes.io/projected/40336ed8-b40b-4340-9566-2d80600db3d6-kube-api-access-kjvdv\") pod \"network-check-source-8894fc9bd-wpxbk\" (UID: \"40336ed8-b40b-4340-9566-2d80600db3d6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk"
Apr 22 18:36:41.612876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-stats-auth\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:41.612876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612721 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-tmp\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s"
Apr 22 18:36:41.612876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s"
Apr 22 18:36:41.612876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-serving-cert\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.612876 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-service-ca-bundle\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s"
Apr 22 18:36:41.613107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-serving-cert\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s"
Apr 22 18:36:41.613107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d89211d4-cc09-4e34-bb17-86aaf93fef39-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"
Apr 22 18:36:41.613107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwsqp\" (UniqueName: \"kubernetes.io/projected/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-kube-api-access-gwsqp\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.613107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.612979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-config\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.613107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.613008 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"
Apr 22 18:36:41.613373 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.613257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5"
Apr 22 18:36:41.613373 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.613304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7mgz\" (UniqueName: \"kubernetes.io/projected/d89211d4-cc09-4e34-bb17-86aaf93fef39-kube-api-access-n7mgz\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"
Apr 22 18:36:41.613373 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.613308 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f0cc31-e890-43ad-a890-116520afc036-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-tkzvq\" (UID: \"77f0cc31-e890-43ad-a890-116520afc036\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"
Apr 22 18:36:41.613491 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.613433 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:36:41.613535 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.613491 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs podName:f20c356d-ebd8-4177-92c7-8bf2571249a2 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:13.613475813 +0000 UTC m=+66.181977601 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs") pod "network-metrics-daemon-vqnz5" (UID: "f20c356d-ebd8-4177-92c7-8bf2571249a2") : secret "metrics-daemon-secret" not found
Apr 22 18:36:41.613632 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.613612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-config\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.613665 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.613622 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-trusted-ca\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk"
Apr 22 18:36:41.614730 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.614713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77f0cc31-e890-43ad-a890-116520afc036-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-tkzvq\" (UID: \"77f0cc31-e890-43ad-a890-116520afc036\") "
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq" Apr 22 18:36:41.614983 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.614967 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-serving-cert\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" Apr 22 18:36:41.624200 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.624180 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbjgt\" (UniqueName: \"kubernetes.io/projected/77f0cc31-e890-43ad-a890-116520afc036-kube-api-access-hbjgt\") pod \"kube-storage-version-migrator-operator-6769c5d45-tkzvq\" (UID: \"77f0cc31-e890-43ad-a890-116520afc036\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq" Apr 22 18:36:41.624496 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.624478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llmv9\" (UniqueName: \"kubernetes.io/projected/b75815e5-b1cc-44af-8c53-32f3fc0feaec-kube-api-access-llmv9\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh" Apr 22 18:36:41.627428 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.627407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwsqp\" (UniqueName: \"kubernetes.io/projected/5e96d5c9-534c-4e08-b6a6-5f20b407b3e3-kube-api-access-gwsqp\") pod \"console-operator-9d4b6777b-vllgk\" (UID: \"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3\") " pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" Apr 22 18:36:41.627732 ip-10-0-133-29 kubenswrapper[2577]: I0422 
18:36:41.627706 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjvdv\" (UniqueName: \"kubernetes.io/projected/40336ed8-b40b-4340-9566-2d80600db3d6-kube-api-access-kjvdv\") pod \"network-check-source-8894fc9bd-wpxbk\" (UID: \"40336ed8-b40b-4340-9566-2d80600db3d6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk" Apr 22 18:36:41.633470 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.633452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w" Apr 22 18:36:41.656915 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.656268 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" Apr 22 18:36:41.664326 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:41.664261 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71861e02_68a2_47be_ad90_6fd2d00d058b.slice/crio-1cd739e09ba8cf73a087dda59669b27be45eef6835c07ead0ee95e5b4d2fe055 WatchSource:0}: Error finding container 1cd739e09ba8cf73a087dda59669b27be45eef6835c07ead0ee95e5b4d2fe055: Status 404 returned error can't find the container with id 1cd739e09ba8cf73a087dda59669b27be45eef6835c07ead0ee95e5b4d2fe055 Apr 22 18:36:41.715056 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.714935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7mgz\" (UniqueName: \"kubernetes.io/projected/d89211d4-cc09-4e34-bb17-86aaf93fef39-kube-api-access-n7mgz\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:36:41.715056 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715011 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nctfj\" (UniqueName: \"kubernetes.io/projected/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-kube-api-access-nctfj\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.715450 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88a79e60-2ccb-4f68-bbaf-9ae07317bdf1-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tt2kq\" (UID: \"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.715546 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-default-certificate\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:41.715546 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:41.715546 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-snapshots\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " 
pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.715759 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715551 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88a79e60-2ccb-4f68-bbaf-9ae07317bdf1-config\") pod \"service-ca-operator-d6fc45fc5-tt2kq\" (UID: \"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.715759 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:41.715759 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsndg\" (UniqueName: \"kubernetes.io/projected/94003369-e42a-4e67-ab30-f6abbd37dcc7-kube-api-access-bsndg\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:41.715759 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715683 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gjr\" (UniqueName: \"kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr\") pod \"network-check-target-p756f\" (UID: \"f632af1b-67e1-4b4d-9446-ea503297edd6\") " pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:41.715759 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715711 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-stats-auth\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:41.715759 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.715734 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:36:41.716159 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.715800 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:42.215780688 +0000 UTC m=+34.784282480 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : secret "router-metrics-certs-default" not found Apr 22 18:36:41.716159 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.716026 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:42.216006185 +0000 UTC m=+34.784507993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : configmap references non-existent config key: service-ca.crt Apr 22 18:36:41.716159 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.716093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-tmp\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.716771 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.715739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-tmp\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.716771 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.716568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.716771 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.716589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-snapshots\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.716771 ip-10-0-133-29 
kubenswrapper[2577]: I0422 18:36:41.716604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k4rg\" (UniqueName: \"kubernetes.io/projected/88a79e60-2ccb-4f68-bbaf-9ae07317bdf1-kube-api-access-4k4rg\") pod \"service-ca-operator-d6fc45fc5-tt2kq\" (UID: \"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.716771 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.716755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-service-ca-bundle\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.717625 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.716782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-serving-cert\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.717625 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.716809 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d89211d4-cc09-4e34-bb17-86aaf93fef39-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:36:41.717625 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.716849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:36:41.717625 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.716961 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:36:41.717625 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:41.717003 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls podName:d89211d4-cc09-4e34-bb17-86aaf93fef39 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:42.216989552 +0000 UTC m=+34.785491343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-dbp2h" (UID: "d89211d4-cc09-4e34-bb17-86aaf93fef39") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:36:41.717913 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.717661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/d89211d4-cc09-4e34-bb17-86aaf93fef39-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:36:41.718311 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.718186 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-service-ca-bundle\") pod 
\"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.718311 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.718302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.719709 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.719676 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-default-certificate\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:41.720094 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.720070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-stats-auth\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:41.720790 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.720768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2gjr\" (UniqueName: \"kubernetes.io/projected/f632af1b-67e1-4b4d-9446-ea503297edd6-kube-api-access-b2gjr\") pod \"network-check-target-p756f\" (UID: \"f632af1b-67e1-4b4d-9446-ea503297edd6\") " pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:41.721563 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.721545 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-serving-cert\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.727385 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.727159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7mgz\" (UniqueName: \"kubernetes.io/projected/d89211d4-cc09-4e34-bb17-86aaf93fef39-kube-api-access-n7mgz\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:36:41.730493 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.728296 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk" Apr 22 18:36:41.730493 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.729632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nctfj\" (UniqueName: \"kubernetes.io/projected/79c52d40-7d6b-4114-9d7d-80fa7fd19bf2-kube-api-access-nctfj\") pod \"insights-operator-585dfdc468-dxw7s\" (UID: \"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2\") " pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.731338 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.731035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsndg\" (UniqueName: \"kubernetes.io/projected/94003369-e42a-4e67-ab30-f6abbd37dcc7-kube-api-access-bsndg\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:41.738831 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.738779 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq" Apr 22 18:36:41.758307 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.753539 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-dxw7s" Apr 22 18:36:41.818070 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.817769 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88a79e60-2ccb-4f68-bbaf-9ae07317bdf1-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tt2kq\" (UID: \"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.818070 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.817831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88a79e60-2ccb-4f68-bbaf-9ae07317bdf1-config\") pod \"service-ca-operator-d6fc45fc5-tt2kq\" (UID: \"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.818070 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.817915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4k4rg\" (UniqueName: \"kubernetes.io/projected/88a79e60-2ccb-4f68-bbaf-9ae07317bdf1-kube-api-access-4k4rg\") pod \"service-ca-operator-d6fc45fc5-tt2kq\" (UID: \"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.821918 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.821532 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:41.825868 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.825782 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88a79e60-2ccb-4f68-bbaf-9ae07317bdf1-config\") pod \"service-ca-operator-d6fc45fc5-tt2kq\" (UID: \"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.826179 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.826102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88a79e60-2ccb-4f68-bbaf-9ae07317bdf1-serving-cert\") pod \"service-ca-operator-d6fc45fc5-tt2kq\" (UID: \"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.840011 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.839954 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k4rg\" (UniqueName: \"kubernetes.io/projected/88a79e60-2ccb-4f68-bbaf-9ae07317bdf1-kube-api-access-4k4rg\") pod \"service-ca-operator-d6fc45fc5-tt2kq\" (UID: \"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.856195 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.855799 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" Apr 22 18:36:41.906311 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.905674 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w"] Apr 22 18:36:41.920885 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.920750 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vllgk"] Apr 22 18:36:41.923023 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.922981 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s"] Apr 22 18:36:41.930092 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.927001 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855"] Apr 22 18:36:41.946057 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.946019 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr"] Apr 22 18:36:41.961979 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:41.961950 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17abeeb9_50cd_4f77_9ee1_e387268f4e5f.slice/crio-422c1ab55697e2cc12c0e195da57a88602381169935c2be48338cfd7e9ba205a WatchSource:0}: Error finding container 422c1ab55697e2cc12c0e195da57a88602381169935c2be48338cfd7e9ba205a: Status 404 returned error can't find the container with id 422c1ab55697e2cc12c0e195da57a88602381169935c2be48338cfd7e9ba205a Apr 22 18:36:41.962958 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:41.962933 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e96d5c9_534c_4e08_b6a6_5f20b407b3e3.slice/crio-55aa03e3559756e8dbc64107f78bb542e5688356fc457c1031f6ab37982627fe WatchSource:0}: Error finding container 55aa03e3559756e8dbc64107f78bb542e5688356fc457c1031f6ab37982627fe: Status 404 returned error can't find the container with id 55aa03e3559756e8dbc64107f78bb542e5688356fc457c1031f6ab37982627fe
Apr 22 18:36:41.964640 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:41.964068 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cb93b16_d01c_455e_9091_30190e9bd5b0.slice/crio-05d4341e814a2575c25eae2b2a5a010acce1a105a6450dbb510c00c8f68b9aec WatchSource:0}: Error finding container 05d4341e814a2575c25eae2b2a5a010acce1a105a6450dbb510c00c8f68b9aec: Status 404 returned error can't find the container with id 05d4341e814a2575c25eae2b2a5a010acce1a105a6450dbb510c00c8f68b9aec
Apr 22 18:36:41.965608 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:41.965572 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af1833a_5bf4_4b71_9f7b_9ca741fcdc32.slice/crio-f396ac8d003a4673d0b4dd30de092e3f954d502410b5282031c48d9b83e3829d WatchSource:0}: Error finding container f396ac8d003a4673d0b4dd30de092e3f954d502410b5282031c48d9b83e3829d: Status 404 returned error can't find the container with id f396ac8d003a4673d0b4dd30de092e3f954d502410b5282031c48d9b83e3829d
Apr 22 18:36:41.967046 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:41.966929 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a55aad3_c17e_4873_8971_af9571fa6014.slice/crio-cb81396398753663c6b9a0b404c8909d1b06b60ce3d65893bf141d76cdf46b3e WatchSource:0}: Error finding container cb81396398753663c6b9a0b404c8909d1b06b60ce3d65893bf141d76cdf46b3e: Status 404 returned error can't find the container with id cb81396398753663c6b9a0b404c8909d1b06b60ce3d65893bf141d76cdf46b3e
Apr 22 18:36:41.982920 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.982850 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq"]
Apr 22 18:36:41.985813 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:41.985792 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77f0cc31_e890_43ad_a890_116520afc036.slice/crio-dc96a6c69380d04cb176af259cc75f82455e4c73e9d4a21cb6ec168bb3135172 WatchSource:0}: Error finding container dc96a6c69380d04cb176af259cc75f82455e4c73e9d4a21cb6ec168bb3135172: Status 404 returned error can't find the container with id dc96a6c69380d04cb176af259cc75f82455e4c73e9d4a21cb6ec168bb3135172
Apr 22 18:36:41.995799 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.995727 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk"]
Apr 22 18:36:41.998221 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:41.998175 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-dxw7s"]
Apr 22 18:36:42.003307 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:42.003287 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40336ed8_b40b_4340_9566_2d80600db3d6.slice/crio-744c65565ad2157aeb480cc34a5e34868cc95202bb23bca74d34ab9de546c1f4 WatchSource:0}: Error finding container 744c65565ad2157aeb480cc34a5e34868cc95202bb23bca74d34ab9de546c1f4: Status 404 returned error can't find the container with id 744c65565ad2157aeb480cc34a5e34868cc95202bb23bca74d34ab9de546c1f4
Apr 22 18:36:42.004657 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:42.004529 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79c52d40_7d6b_4114_9d7d_80fa7fd19bf2.slice/crio-d07bae31aee05e8c139ae38d2fd322482f4e69f8e0c4956c43297a62eb96e279 WatchSource:0}: Error finding container d07bae31aee05e8c139ae38d2fd322482f4e69f8e0c4956c43297a62eb96e279: Status 404 returned error can't find the container with id d07bae31aee05e8c139ae38d2fd322482f4e69f8e0c4956c43297a62eb96e279
Apr 22 18:36:42.023112 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.023081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w" event={"ID":"17abeeb9-50cd-4f77-9ee1-e387268f4e5f","Type":"ContainerStarted","Data":"422c1ab55697e2cc12c0e195da57a88602381169935c2be48338cfd7e9ba205a"}
Apr 22 18:36:42.024192 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.024168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" event={"ID":"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3","Type":"ContainerStarted","Data":"55aa03e3559756e8dbc64107f78bb542e5688356fc457c1031f6ab37982627fe"}
Apr 22 18:36:42.025267 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.025222 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dxw7s" event={"ID":"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2","Type":"ContainerStarted","Data":"d07bae31aee05e8c139ae38d2fd322482f4e69f8e0c4956c43297a62eb96e279"}
Apr 22 18:36:42.026587 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.026566 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sdrsv" event={"ID":"71861e02-68a2-47be-ad90-6fd2d00d058b","Type":"ContainerStarted","Data":"ae73fc0494d6f1aa17dc24c03f676118d5ad43d1846f6f5ed6c67e3b2ad2cd2e"}
Apr 22 18:36:42.026682 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.026595 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sdrsv" event={"ID":"71861e02-68a2-47be-ad90-6fd2d00d058b","Type":"ContainerStarted","Data":"1cd739e09ba8cf73a087dda59669b27be45eef6835c07ead0ee95e5b4d2fe055"}
Apr 22 18:36:42.027562 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.027524 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" event={"ID":"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32","Type":"ContainerStarted","Data":"f396ac8d003a4673d0b4dd30de092e3f954d502410b5282031c48d9b83e3829d"}
Apr 22 18:36:42.028370 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.028354 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" event={"ID":"8a55aad3-c17e-4873-8971-af9571fa6014","Type":"ContainerStarted","Data":"cb81396398753663c6b9a0b404c8909d1b06b60ce3d65893bf141d76cdf46b3e"}
Apr 22 18:36:42.029333 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.029309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk" event={"ID":"40336ed8-b40b-4340-9566-2d80600db3d6","Type":"ContainerStarted","Data":"744c65565ad2157aeb480cc34a5e34868cc95202bb23bca74d34ab9de546c1f4"}
Apr 22 18:36:42.030527 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.030496 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq" event={"ID":"77f0cc31-e890-43ad-a890-116520afc036","Type":"ContainerStarted","Data":"dc96a6c69380d04cb176af259cc75f82455e4c73e9d4a21cb6ec168bb3135172"}
Apr 22 18:36:42.031818 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.031784 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" event={"ID":"6cb93b16-d01c-455e-9091-30190e9bd5b0","Type":"ContainerStarted","Data":"05d4341e814a2575c25eae2b2a5a010acce1a105a6450dbb510c00c8f68b9aec"}
Apr 22 18:36:42.043061 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.042947 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sdrsv" podStartSLOduration=3.042882093 podStartE2EDuration="3.042882093s" podCreationTimestamp="2026-04-22 18:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:42.042673279 +0000 UTC m=+34.611175087" watchObservedRunningTime="2026-04-22 18:36:42.042882093 +0000 UTC m=+34.611383899"
Apr 22 18:36:42.121010 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.120981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"
Apr 22 18:36:42.121178 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:42.121154 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:36:42.121259 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:42.121243 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls podName:b75815e5-b1cc-44af-8c53-32f3fc0feaec nodeName:}" failed. No retries permitted until 2026-04-22 18:36:43.121206494 +0000 UTC m=+35.689708304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fljkh" (UID: "b75815e5-b1cc-44af-8c53-32f3fc0feaec") : secret "samples-operator-tls" not found
Apr 22 18:36:42.134804 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.134776 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq"]
Apr 22 18:36:42.138204 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:42.138173 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88a79e60_2ccb_4f68_bbaf_9ae07317bdf1.slice/crio-0514ae45299671bad2f7e8ead7040ac0e03d76bb95c23a75bff8d8933b28022c WatchSource:0}: Error finding container 0514ae45299671bad2f7e8ead7040ac0e03d76bb95c23a75bff8d8933b28022c: Status 404 returned error can't find the container with id 0514ae45299671bad2f7e8ead7040ac0e03d76bb95c23a75bff8d8933b28022c
Apr 22 18:36:42.147500 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.147468 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p756f"]
Apr 22 18:36:42.150126 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:42.150101 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf632af1b_67e1_4b4d_9446_ea503297edd6.slice/crio-bf483f446293a201252ba576e1cea51ad0e54c0e5ad9447b2d78ae477a998d61 WatchSource:0}: Error finding container bf483f446293a201252ba576e1cea51ad0e54c0e5ad9447b2d78ae477a998d61: Status 404 returned error can't find the container with id bf483f446293a201252ba576e1cea51ad0e54c0e5ad9447b2d78ae477a998d61
Apr 22 18:36:42.222311 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.222286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:42.222455 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:42.222439 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:43.222420695 +0000 UTC m=+35.790922484 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : configmap references non-existent config key: service-ca.crt
Apr 22 18:36:42.222518 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.222470 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"
Apr 22 18:36:42.222574 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:42.222547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:42.222630 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:42.222609 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:36:42.222679 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:42.222637 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:36:42.222679 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:42.222662 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls podName:d89211d4-cc09-4e34-bb17-86aaf93fef39 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:43.222650358 +0000 UTC m=+35.791152143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-dbp2h" (UID: "d89211d4-cc09-4e34-bb17-86aaf93fef39") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:36:42.222679 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:42.222675 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:43.222668665 +0000 UTC m=+35.791170449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : secret "router-metrics-certs-default" not found
Apr 22 18:36:43.041617 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.041579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" event={"ID":"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1","Type":"ContainerStarted","Data":"0514ae45299671bad2f7e8ead7040ac0e03d76bb95c23a75bff8d8933b28022c"}
Apr 22 18:36:43.045778 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.045748 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p756f" event={"ID":"f632af1b-67e1-4b4d-9446-ea503297edd6","Type":"ContainerStarted","Data":"bf483f446293a201252ba576e1cea51ad0e54c0e5ad9447b2d78ae477a998d61"}
Apr 22 18:36:43.050148 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.050123 2577 generic.go:358] "Generic (PLEG): container finished" podID="447d848d-6ef3-4b39-a91c-16579bc83c6d" containerID="78bba9030751c99be50846ca6a8f1e87c589bc4b11ae70e40d6b415c0997d424" exitCode=0
Apr 22 18:36:43.050268 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.050164 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" event={"ID":"447d848d-6ef3-4b39-a91c-16579bc83c6d","Type":"ContainerDied","Data":"78bba9030751c99be50846ca6a8f1e87c589bc4b11ae70e40d6b415c0997d424"}
Apr 22 18:36:43.133051 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.132469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"
Apr 22 18:36:43.133051 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.132553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx"
Apr 22 18:36:43.133051 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.132645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb"
Apr 22 18:36:43.133051 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.132870 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:36:43.133051 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.132936 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls podName:b75815e5-b1cc-44af-8c53-32f3fc0feaec nodeName:}" failed. No retries permitted until 2026-04-22 18:36:45.132915631 +0000 UTC m=+37.701417421 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fljkh" (UID: "b75815e5-b1cc-44af-8c53-32f3fc0feaec") : secret "samples-operator-tls" not found
Apr 22 18:36:43.133421 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.133343 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:36:43.133421 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.133358 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56667fd489-p9kvx: secret "image-registry-tls" not found
Apr 22 18:36:43.133421 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.133398 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls podName:171e80c5-873b-40fc-b152-1a96147b240e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.133384408 +0000 UTC m=+39.701886199 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls") pod "image-registry-56667fd489-p9kvx" (UID: "171e80c5-873b-40fc-b152-1a96147b240e") : secret "image-registry-tls" not found
Apr 22 18:36:43.133674 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.133556 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:36:43.133674 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.133597 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert podName:84a0c8c1-7bbb-41ec-86d3-1c379b8789ff nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.133583354 +0000 UTC m=+39.702085139 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert") pod "ingress-canary-vtrxb" (UID: "84a0c8c1-7bbb-41ec-86d3-1c379b8789ff") : secret "canary-serving-cert" not found
Apr 22 18:36:43.234347 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.234313 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:43.234509 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.234373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:43.234509 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.234477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s"
Apr 22 18:36:43.234636 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:43.234515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"
Apr 22 18:36:43.235246 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.235212 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:36:43.235363 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.235290 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:45.235270885 +0000 UTC m=+37.803772669 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : secret "router-metrics-certs-default" not found
Apr 22 18:36:43.235697 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.235680 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:45.235664234 +0000 UTC m=+37.804166036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : configmap references non-existent config key: service-ca.crt
Apr 22 18:36:43.236596 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.236241 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:36:43.236596 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.236288 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls podName:25870685-4492-459c-a0e2-82a05d74127a nodeName:}" failed. No retries permitted until 2026-04-22 18:36:47.236273846 +0000 UTC m=+39.804775637 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls") pod "dns-default-vc87s" (UID: "25870685-4492-459c-a0e2-82a05d74127a") : secret "dns-default-metrics-tls" not found
Apr 22 18:36:43.236596 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.236349 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:36:43.236596 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:43.236381 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls podName:d89211d4-cc09-4e34-bb17-86aaf93fef39 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:45.236370421 +0000 UTC m=+37.804872209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-dbp2h" (UID: "d89211d4-cc09-4e34-bb17-86aaf93fef39") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:36:44.064851 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:44.064809 2577 generic.go:358] "Generic (PLEG): container finished" podID="447d848d-6ef3-4b39-a91c-16579bc83c6d" containerID="bcc882d956584e0c4bb5123cba67436bb16d648539977bc60ff717fd16d45129" exitCode=0
Apr 22 18:36:44.065488 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:44.064893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" event={"ID":"447d848d-6ef3-4b39-a91c-16579bc83c6d","Type":"ContainerDied","Data":"bcc882d956584e0c4bb5123cba67436bb16d648539977bc60ff717fd16d45129"}
Apr 22 18:36:45.157138 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:45.157087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"
Apr 22 18:36:45.157718 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:45.157481 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:36:45.157718 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:45.157544 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls podName:b75815e5-b1cc-44af-8c53-32f3fc0feaec nodeName:}" failed. No retries permitted until 2026-04-22 18:36:49.157525705 +0000 UTC m=+41.726027494 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fljkh" (UID: "b75815e5-b1cc-44af-8c53-32f3fc0feaec") : secret "samples-operator-tls" not found
Apr 22 18:36:45.259306 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:45.257978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:45.259306 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:45.258349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"
Apr 22 18:36:45.259306 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:45.258443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:45.259306 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:45.258571 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:36:45.259306 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:45.258640 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:49.258621794 +0000 UTC m=+41.827123582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : secret "router-metrics-certs-default" not found
Apr 22 18:36:45.259306 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:45.259148 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:49.259131751 +0000 UTC m=+41.827633549 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : configmap references non-existent config key: service-ca.crt
Apr 22 18:36:45.259306 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:45.259219 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:36:45.259306 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:45.259271 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls podName:d89211d4-cc09-4e34-bb17-86aaf93fef39 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:49.259259746 +0000 UTC m=+41.827761536 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-dbp2h" (UID: "d89211d4-cc09-4e34-bb17-86aaf93fef39") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:36:46.691142 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:46.691097 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:46.699601 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:46.699543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3a1c2eb-91b0-417e-818b-e08b94eca20e-original-pull-secret\") pod \"global-pull-secret-syncer-fbr4l\" (UID: \"b3a1c2eb-91b0-417e-818b-e08b94eca20e\") " pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:46.925884 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:46.925805 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fbr4l"
Apr 22 18:36:47.194568 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:47.194536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx"
Apr 22 18:36:47.194743 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:47.194623 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb"
Apr 22 18:36:47.194809 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:47.194739 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:36:47.194809 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:47.194757 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:36:47.194809 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:47.194763 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56667fd489-p9kvx: secret "image-registry-tls" not found
Apr 22 18:36:47.194958 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:47.194817 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert podName:84a0c8c1-7bbb-41ec-86d3-1c379b8789ff nodeName:}" failed. No retries permitted until 2026-04-22 18:36:55.194802986 +0000 UTC m=+47.763304773 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert") pod "ingress-canary-vtrxb" (UID: "84a0c8c1-7bbb-41ec-86d3-1c379b8789ff") : secret "canary-serving-cert" not found
Apr 22 18:36:47.194958 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:47.194832 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls podName:171e80c5-873b-40fc-b152-1a96147b240e nodeName:}" failed. No retries permitted until 2026-04-22 18:36:55.19482574 +0000 UTC m=+47.763327525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls") pod "image-registry-56667fd489-p9kvx" (UID: "171e80c5-873b-40fc-b152-1a96147b240e") : secret "image-registry-tls" not found
Apr 22 18:36:47.295807 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:47.295776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s"
Apr 22 18:36:47.295967 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:47.295892 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:36:47.295967 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:47.295949 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls podName:25870685-4492-459c-a0e2-82a05d74127a nodeName:}" failed. No retries permitted until 2026-04-22 18:36:55.295935128 +0000 UTC m=+47.864436913 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls") pod "dns-default-vc87s" (UID: "25870685-4492-459c-a0e2-82a05d74127a") : secret "dns-default-metrics-tls" not found
Apr 22 18:36:49.210913 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:49.210872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"
Apr 22 18:36:49.211357 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:49.211009 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:36:49.211357 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:49.211069 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls podName:b75815e5-b1cc-44af-8c53-32f3fc0feaec nodeName:}" failed. No retries permitted until 2026-04-22 18:36:57.211054468 +0000 UTC m=+49.779556253 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fljkh" (UID: "b75815e5-b1cc-44af-8c53-32f3fc0feaec") : secret "samples-operator-tls" not found
Apr 22 18:36:49.311790 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:49.311754 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:49.311958 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:49.311803 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc"
Apr 22 18:36:49.311958 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:49.311922 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:36:49.311958 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:49.311941 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:57.311927991 +0000 UTC m=+49.880429775 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : configmap references non-existent config key: service-ca.crt Apr 22 18:36:49.312133 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:49.311984 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:57.311967268 +0000 UTC m=+49.880469056 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : secret "router-metrics-certs-default" not found Apr 22 18:36:49.312133 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:49.312039 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:36:49.312247 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:49.312212 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:36:49.312306 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:49.312295 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls podName:d89211d4-cc09-4e34-bb17-86aaf93fef39 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:57.312280389 +0000 UTC m=+49.880782192 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-dbp2h" (UID: "d89211d4-cc09-4e34-bb17-86aaf93fef39") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:36:54.832027 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:54.831999 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fbr4l"] Apr 22 18:36:54.843320 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:36:54.843288 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3a1c2eb_91b0_417e_818b_e08b94eca20e.slice/crio-0b24bc62a742a977d54d6d11a9d25bd959b4027e164dcab850451b1abc05d545 WatchSource:0}: Error finding container 0b24bc62a742a977d54d6d11a9d25bd959b4027e164dcab850451b1abc05d545: Status 404 returned error can't find the container with id 0b24bc62a742a977d54d6d11a9d25bd959b4027e164dcab850451b1abc05d545 Apr 22 18:36:55.094011 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.093974 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk" event={"ID":"40336ed8-b40b-4340-9566-2d80600db3d6","Type":"ContainerStarted","Data":"29709c7920ab7445bcc308149752c08d84209a40090115dedbc4510d1ab8e63d"} Apr 22 18:36:55.097734 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.097698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" event={"ID":"447d848d-6ef3-4b39-a91c-16579bc83c6d","Type":"ContainerStarted","Data":"1d393607d53aef65db30498a7a7f94e86c408632346c33c4725bd559de09b9dc"} Apr 22 18:36:55.099241 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.099200 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq" event={"ID":"77f0cc31-e890-43ad-a890-116520afc036","Type":"ContainerStarted","Data":"33272609056c699e29136cb9d425e6c36b55c069e1086151588df1b58362f937"} Apr 22 18:36:55.100617 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.100594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" event={"ID":"6cb93b16-d01c-455e-9091-30190e9bd5b0","Type":"ContainerStarted","Data":"9bb3683e7f8b5d01c104eb084673b8e71bb49ff2582ebbe767c15779d15789c3"} Apr 22 18:36:55.101860 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.101840 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" event={"ID":"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1","Type":"ContainerStarted","Data":"22f3b830929153400af89c61d2f6f99cfb31b91728e7865eba9269ddaca62d58"} Apr 22 18:36:55.103044 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.103026 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w" event={"ID":"17abeeb9-50cd-4f77-9ee1-e387268f4e5f","Type":"ContainerStarted","Data":"72059a6513fd7ce4738b2f5fa664598e069af38057cb5cdc0349c17733814eed"} Apr 22 18:36:55.104277 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.104257 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fbr4l" event={"ID":"b3a1c2eb-91b0-417e-818b-e08b94eca20e","Type":"ContainerStarted","Data":"0b24bc62a742a977d54d6d11a9d25bd959b4027e164dcab850451b1abc05d545"} Apr 22 18:36:55.105666 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.105651 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/0.log" Apr 22 18:36:55.105742 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.105677 2577 generic.go:358] "Generic (PLEG): container finished" podID="5e96d5c9-534c-4e08-b6a6-5f20b407b3e3" containerID="5263066afb06c4db1250a1fc6c1016b8a509fa52a402ca60412a94e7e14e2026" exitCode=255 Apr 22 18:36:55.105742 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.105727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" event={"ID":"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3","Type":"ContainerDied","Data":"5263066afb06c4db1250a1fc6c1016b8a509fa52a402ca60412a94e7e14e2026"} Apr 22 18:36:55.105912 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.105899 2577 scope.go:117] "RemoveContainer" containerID="5263066afb06c4db1250a1fc6c1016b8a509fa52a402ca60412a94e7e14e2026" Apr 22 18:36:55.107522 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.107502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dxw7s" event={"ID":"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2","Type":"ContainerStarted","Data":"3bb3634f7ef385c28c9e42070f9a4c40c1dbcd2628391ce8513414b4ab59a55d"} Apr 22 18:36:55.109243 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.109207 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" event={"ID":"3af1833a-5bf4-4b71-9f7b-9ca741fcdc32","Type":"ContainerStarted","Data":"765fa032795e1d9cdd78c309a304ce71a5d54abaabf26e1dfc206ac766a03de6"} Apr 22 18:36:55.109531 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.109514 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:55.110900 ip-10-0-133-29 kubenswrapper[2577]: I0422 
18:36:55.110849 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" event={"ID":"8a55aad3-c17e-4873-8971-af9571fa6014","Type":"ContainerStarted","Data":"8546f65c0ea26ad14b8655cb77387796b801c6a34269da831636c503711e7d2b"} Apr 22 18:36:55.112113 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.112068 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-wpxbk" podStartSLOduration=1.408102566 podStartE2EDuration="14.112053373s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="2026-04-22 18:36:42.006433038 +0000 UTC m=+34.574934824" lastFinishedPulling="2026-04-22 18:36:54.710383832 +0000 UTC m=+47.278885631" observedRunningTime="2026-04-22 18:36:55.110530055 +0000 UTC m=+47.679031864" watchObservedRunningTime="2026-04-22 18:36:55.112053373 +0000 UTC m=+47.680555185" Apr 22 18:36:55.120691 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.120668 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" Apr 22 18:36:55.120791 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.120698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p756f" event={"ID":"f632af1b-67e1-4b4d-9446-ea503297edd6","Type":"ContainerStarted","Data":"c11109e6325c64f03d1dfb46d88f90569ad8838c0c5a74fcf37cc6f32dbfabf5"} Apr 22 18:36:55.120868 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.120852 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:36:55.148424 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.148385 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq" podStartSLOduration=1.432623605 podStartE2EDuration="14.148371228s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="2026-04-22 18:36:41.990027795 +0000 UTC m=+34.558529581" lastFinishedPulling="2026-04-22 18:36:54.705775405 +0000 UTC m=+47.274277204" observedRunningTime="2026-04-22 18:36:55.148258435 +0000 UTC m=+47.716760239" watchObservedRunningTime="2026-04-22 18:36:55.148371228 +0000 UTC m=+47.716873035" Apr 22 18:36:55.179713 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.179657 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" podStartSLOduration=1.673044786 podStartE2EDuration="14.179635998s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="2026-04-22 18:36:42.140613211 +0000 UTC m=+34.709114996" lastFinishedPulling="2026-04-22 18:36:54.647204415 +0000 UTC m=+47.215706208" observedRunningTime="2026-04-22 18:36:55.16896762 +0000 UTC m=+47.737469428" watchObservedRunningTime="2026-04-22 18:36:55.179635998 +0000 UTC m=+47.748137807" Apr 22 18:36:55.209442 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.209405 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dqfsv" podStartSLOduration=15.803931139 podStartE2EDuration="47.209391726s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:36:10.619366173 +0000 UTC m=+3.187867961" lastFinishedPulling="2026-04-22 18:36:42.024826749 +0000 UTC m=+34.593328548" observedRunningTime="2026-04-22 18:36:55.207736495 +0000 UTC m=+47.776238325" watchObservedRunningTime="2026-04-22 18:36:55.209391726 +0000 UTC m=+47.777893533" Apr 22 18:36:55.234449 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.234395 2577 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-754879b4d5-4fp2s" podStartSLOduration=14.519912737 podStartE2EDuration="27.234377088s" podCreationTimestamp="2026-04-22 18:36:28 +0000 UTC" firstStartedPulling="2026-04-22 18:36:41.99003804 +0000 UTC m=+34.558539840" lastFinishedPulling="2026-04-22 18:36:54.704502376 +0000 UTC m=+47.273004191" observedRunningTime="2026-04-22 18:36:55.232320968 +0000 UTC m=+47.800822777" watchObservedRunningTime="2026-04-22 18:36:55.234377088 +0000 UTC m=+47.802878896" Apr 22 18:36:55.258930 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.258816 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5c4556f8d4-qd855" podStartSLOduration=14.539777302 podStartE2EDuration="27.258803782s" podCreationTimestamp="2026-04-22 18:36:28 +0000 UTC" firstStartedPulling="2026-04-22 18:36:41.989890315 +0000 UTC m=+34.558392114" lastFinishedPulling="2026-04-22 18:36:54.70891679 +0000 UTC m=+47.277418594" observedRunningTime="2026-04-22 18:36:55.257650963 +0000 UTC m=+47.826152771" watchObservedRunningTime="2026-04-22 18:36:55.258803782 +0000 UTC m=+47.827305589" Apr 22 18:36:55.263895 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.263873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:36:55.264013 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.263961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " 
pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:36:55.264256 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:55.264159 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:55.264256 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:55.264210 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:36:55.264256 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:55.264224 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56667fd489-p9kvx: secret "image-registry-tls" not found Apr 22 18:36:55.264458 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:55.264214 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert podName:84a0c8c1-7bbb-41ec-86d3-1c379b8789ff nodeName:}" failed. No retries permitted until 2026-04-22 18:37:11.264199314 +0000 UTC m=+63.832701108 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert") pod "ingress-canary-vtrxb" (UID: "84a0c8c1-7bbb-41ec-86d3-1c379b8789ff") : secret "canary-serving-cert" not found Apr 22 18:36:55.264458 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:55.264299 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls podName:171e80c5-873b-40fc-b152-1a96147b240e nodeName:}" failed. No retries permitted until 2026-04-22 18:37:11.264286126 +0000 UTC m=+63.832787912 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls") pod "image-registry-56667fd489-p9kvx" (UID: "171e80c5-873b-40fc-b152-1a96147b240e") : secret "image-registry-tls" not found Apr 22 18:36:55.273330 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.273285 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rmv4w" podStartSLOduration=1.6908674590000001 podStartE2EDuration="14.273270699s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="2026-04-22 18:36:41.964759776 +0000 UTC m=+34.533261569" lastFinishedPulling="2026-04-22 18:36:54.547163025 +0000 UTC m=+47.115664809" observedRunningTime="2026-04-22 18:36:55.272397996 +0000 UTC m=+47.840899806" watchObservedRunningTime="2026-04-22 18:36:55.273270699 +0000 UTC m=+47.841772508" Apr 22 18:36:55.321912 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.320836 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-p756f" podStartSLOduration=34.757439733 podStartE2EDuration="47.320817355s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:36:42.155752591 +0000 UTC m=+34.724254379" lastFinishedPulling="2026-04-22 18:36:54.719130215 +0000 UTC m=+47.287632001" observedRunningTime="2026-04-22 18:36:55.318306481 +0000 UTC m=+47.886808286" watchObservedRunningTime="2026-04-22 18:36:55.320817355 +0000 UTC m=+47.889319162" Apr 22 18:36:55.321912 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.321559 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-dxw7s" podStartSLOduration=1.6240243909999998 podStartE2EDuration="14.321548759s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="2026-04-22 
18:36:42.006521687 +0000 UTC m=+34.575023479" lastFinishedPulling="2026-04-22 18:36:54.704046062 +0000 UTC m=+47.272547847" observedRunningTime="2026-04-22 18:36:55.293668946 +0000 UTC m=+47.862170756" watchObservedRunningTime="2026-04-22 18:36:55.321548759 +0000 UTC m=+47.890050567" Apr 22 18:36:55.365804 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:55.365079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:36:55.365804 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:55.365284 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:55.365804 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:55.365339 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls podName:25870685-4492-459c-a0e2-82a05d74127a nodeName:}" failed. No retries permitted until 2026-04-22 18:37:11.365321849 +0000 UTC m=+63.933823641 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls") pod "dns-default-vc87s" (UID: "25870685-4492-459c-a0e2-82a05d74127a") : secret "dns-default-metrics-tls" not found Apr 22 18:36:56.118376 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:56.118296 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:36:56.119125 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:56.119101 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/0.log" Apr 22 18:36:56.119251 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:56.119137 2577 generic.go:358] "Generic (PLEG): container finished" podID="5e96d5c9-534c-4e08-b6a6-5f20b407b3e3" containerID="4e5dce6b061f38cf86cc00ab99946644be69127d9a6ad7426975219ae1264a0c" exitCode=255 Apr 22 18:36:56.119943 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:56.119925 2577 scope.go:117] "RemoveContainer" containerID="4e5dce6b061f38cf86cc00ab99946644be69127d9a6ad7426975219ae1264a0c" Apr 22 18:36:56.120129 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:56.120100 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-vllgk_openshift-console-operator(5e96d5c9-534c-4e08-b6a6-5f20b407b3e3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" podUID="5e96d5c9-534c-4e08-b6a6-5f20b407b3e3" Apr 22 18:36:56.120360 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:56.120340 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" 
event={"ID":"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3","Type":"ContainerDied","Data":"4e5dce6b061f38cf86cc00ab99946644be69127d9a6ad7426975219ae1264a0c"} Apr 22 18:36:56.120429 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:56.120377 2577 scope.go:117] "RemoveContainer" containerID="5263066afb06c4db1250a1fc6c1016b8a509fa52a402ca60412a94e7e14e2026" Apr 22 18:36:57.124403 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:57.124372 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:36:57.125035 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:57.125014 2577 scope.go:117] "RemoveContainer" containerID="4e5dce6b061f38cf86cc00ab99946644be69127d9a6ad7426975219ae1264a0c" Apr 22 18:36:57.125279 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:57.125221 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-vllgk_openshift-console-operator(5e96d5c9-534c-4e08-b6a6-5f20b407b3e3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" podUID="5e96d5c9-534c-4e08-b6a6-5f20b407b3e3" Apr 22 18:36:57.287583 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:57.287542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh" Apr 22 18:36:57.287770 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:57.287694 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 
18:36:57.287848 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:57.287791 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls podName:b75815e5-b1cc-44af-8c53-32f3fc0feaec nodeName:}" failed. No retries permitted until 2026-04-22 18:37:13.287741424 +0000 UTC m=+65.856243215 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fljkh" (UID: "b75815e5-b1cc-44af-8c53-32f3fc0feaec") : secret "samples-operator-tls" not found Apr 22 18:36:57.389017 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:57.388937 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:36:57.389179 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:57.389037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:57.389179 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:57.389077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " 
pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:36:57.389179 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:57.389089 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:36:57.389179 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:57.389156 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:36:57.389179 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:57.389169 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls podName:d89211d4-cc09-4e34-bb17-86aaf93fef39 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:13.389146431 +0000 UTC m=+65.957648240 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-dbp2h" (UID: "d89211d4-cc09-4e34-bb17-86aaf93fef39") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:36:57.389501 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:57.389194 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:13.389183974 +0000 UTC m=+65.957685759 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : secret "router-metrics-certs-default" not found Apr 22 18:36:57.389501 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:36:57.389209 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle podName:94003369-e42a-4e67-ab30-f6abbd37dcc7 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:13.389201652 +0000 UTC m=+65.957703437 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle") pod "router-default-7786869f5d-r5qgc" (UID: "94003369-e42a-4e67-ab30-f6abbd37dcc7") : configmap references non-existent config key: service-ca.crt Apr 22 18:36:58.128783 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:58.128691 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" event={"ID":"8a55aad3-c17e-4873-8971-af9571fa6014","Type":"ContainerStarted","Data":"d28c1faf34d415bd9440c16aefba2e5643014ea59c9894e7df752df549b5dd90"} Apr 22 18:36:58.539470 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:58.539441 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sdrsv_71861e02-68a2-47be-ad90-6fd2d00d058b/dns-node-resolver/0.log" Apr 22 18:36:59.740506 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:36:59.740481 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wm9hb_938b317c-4e75-4cee-9219-836d71fde87b/node-ca/0.log" Apr 22 18:37:00.134874 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:00.134839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-fbr4l" event={"ID":"b3a1c2eb-91b0-417e-818b-e08b94eca20e","Type":"ContainerStarted","Data":"9fa9a24062077bd373a34be7299892d1166cd874b4b3fdd0eb60ad04bc583f55"} Apr 22 18:37:00.136539 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:00.136513 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" event={"ID":"8a55aad3-c17e-4873-8971-af9571fa6014","Type":"ContainerStarted","Data":"273a9e47083486417cb9cbe774602a2231119a97f34007746b1f83c3f8d209c9"} Apr 22 18:37:00.150871 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:00.150833 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fbr4l" podStartSLOduration=41.289547677 podStartE2EDuration="46.150821771s" podCreationTimestamp="2026-04-22 18:36:14 +0000 UTC" firstStartedPulling="2026-04-22 18:36:54.845556111 +0000 UTC m=+47.414057902" lastFinishedPulling="2026-04-22 18:36:59.706830208 +0000 UTC m=+52.275331996" observedRunningTime="2026-04-22 18:37:00.150814407 +0000 UTC m=+52.719316208" watchObservedRunningTime="2026-04-22 18:37:00.150821771 +0000 UTC m=+52.719323578" Apr 22 18:37:00.173611 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:00.173567 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" podStartSLOduration=16.332144809 podStartE2EDuration="32.173554054s" podCreationTimestamp="2026-04-22 18:36:28 +0000 UTC" firstStartedPulling="2026-04-22 18:36:41.989807666 +0000 UTC m=+34.558309464" lastFinishedPulling="2026-04-22 18:36:57.83121692 +0000 UTC m=+50.399718709" observedRunningTime="2026-04-22 18:37:00.17283724 +0000 UTC m=+52.741339083" watchObservedRunningTime="2026-04-22 18:37:00.173554054 +0000 UTC m=+52.742055862" Apr 22 18:37:01.657628 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:01.657584 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" Apr 22 18:37:01.657628 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:01.657626 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" Apr 22 18:37:01.658094 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:01.657965 2577 scope.go:117] "RemoveContainer" containerID="4e5dce6b061f38cf86cc00ab99946644be69127d9a6ad7426975219ae1264a0c" Apr 22 18:37:01.658130 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:37:01.658117 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-vllgk_openshift-console-operator(5e96d5c9-534c-4e08-b6a6-5f20b407b3e3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" podUID="5e96d5c9-534c-4e08-b6a6-5f20b407b3e3" Apr 22 18:37:07.022242 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:07.022194 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pr85c" Apr 22 18:37:11.326311 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.326282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:37:11.326883 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.326380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " 
pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:37:11.328644 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.328625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"image-registry-56667fd489-p9kvx\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:37:11.328698 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.328653 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84a0c8c1-7bbb-41ec-86d3-1c379b8789ff-cert\") pod \"ingress-canary-vtrxb\" (UID: \"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff\") " pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:37:11.426962 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.426920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:37:11.428882 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.428865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25870685-4492-459c-a0e2-82a05d74127a-metrics-tls\") pod \"dns-default-vc87s\" (UID: \"25870685-4492-459c-a0e2-82a05d74127a\") " pod="openshift-dns/dns-default-vc87s" Apr 22 18:37:11.473087 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.473066 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8gvfs\"" Apr 22 18:37:11.481419 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.481403 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:37:11.490028 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.489993 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-x8r7b\"" Apr 22 18:37:11.498544 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.498518 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vtrxb" Apr 22 18:37:11.530251 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.530180 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xfhxc\"" Apr 22 18:37:11.538391 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.538363 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vc87s" Apr 22 18:37:11.619546 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.619516 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56667fd489-p9kvx"] Apr 22 18:37:11.625198 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:37:11.625172 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod171e80c5_873b_40fc_b152_1a96147b240e.slice/crio-c446098cf47a098cd24cb4503c504c388c77c9f5801998580f5b2fcaccd4a592 WatchSource:0}: Error finding container c446098cf47a098cd24cb4503c504c388c77c9f5801998580f5b2fcaccd4a592: Status 404 returned error can't find the container with id c446098cf47a098cd24cb4503c504c388c77c9f5801998580f5b2fcaccd4a592 Apr 22 18:37:11.632789 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.632762 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vtrxb"] Apr 22 18:37:11.636688 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:37:11.636654 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a0c8c1_7bbb_41ec_86d3_1c379b8789ff.slice/crio-19c6c329e67b72fe7ce2337575cc4a3c9d528ba3c9764c476f96207a69488b45 WatchSource:0}: Error finding container 19c6c329e67b72fe7ce2337575cc4a3c9d528ba3c9764c476f96207a69488b45: Status 404 returned error can't find the container with id 19c6c329e67b72fe7ce2337575cc4a3c9d528ba3c9764c476f96207a69488b45 Apr 22 18:37:11.672421 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:11.672394 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vc87s"] Apr 22 18:37:11.675147 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:37:11.675123 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25870685_4492_459c_a0e2_82a05d74127a.slice/crio-bd9cd26a4d5e06f61d4f2372cdf44d00c2d00094f94205772f4249c5d268f9d8 WatchSource:0}: Error finding container bd9cd26a4d5e06f61d4f2372cdf44d00c2d00094f94205772f4249c5d268f9d8: Status 404 returned error can't find the container with id bd9cd26a4d5e06f61d4f2372cdf44d00c2d00094f94205772f4249c5d268f9d8 Apr 22 18:37:12.171905 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:12.171860 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vc87s" event={"ID":"25870685-4492-459c-a0e2-82a05d74127a","Type":"ContainerStarted","Data":"bd9cd26a4d5e06f61d4f2372cdf44d00c2d00094f94205772f4249c5d268f9d8"} Apr 22 18:37:12.173249 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:12.173194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vtrxb" event={"ID":"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff","Type":"ContainerStarted","Data":"19c6c329e67b72fe7ce2337575cc4a3c9d528ba3c9764c476f96207a69488b45"} Apr 22 18:37:12.174765 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:12.174743 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-56667fd489-p9kvx" event={"ID":"171e80c5-873b-40fc-b152-1a96147b240e","Type":"ContainerStarted","Data":"b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348"} Apr 22 18:37:12.174765 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:12.174772 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" event={"ID":"171e80c5-873b-40fc-b152-1a96147b240e","Type":"ContainerStarted","Data":"c446098cf47a098cd24cb4503c504c388c77c9f5801998580f5b2fcaccd4a592"} Apr 22 18:37:12.174950 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:12.174908 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:37:12.197994 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:12.197956 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" podStartSLOduration=64.197941917 podStartE2EDuration="1m4.197941917s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:12.197770201 +0000 UTC m=+64.766272012" watchObservedRunningTime="2026-04-22 18:37:12.197941917 +0000 UTC m=+64.766443723" Apr 22 18:37:13.341891 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.341852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh" Apr 22 18:37:13.344438 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.344413 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b75815e5-b1cc-44af-8c53-32f3fc0feaec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fljkh\" (UID: \"b75815e5-b1cc-44af-8c53-32f3fc0feaec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh" Apr 22 18:37:13.442673 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.442633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:37:13.442822 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.442710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:37:13.442822 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.442733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:37:13.445281 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.445254 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94003369-e42a-4e67-ab30-f6abbd37dcc7-metrics-certs\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") 
" pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:37:13.445390 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.445277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/d89211d4-cc09-4e34-bb17-86aaf93fef39-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dbp2h\" (UID: \"d89211d4-cc09-4e34-bb17-86aaf93fef39\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:37:13.453978 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.453955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94003369-e42a-4e67-ab30-f6abbd37dcc7-service-ca-bundle\") pod \"router-default-7786869f5d-r5qgc\" (UID: \"94003369-e42a-4e67-ab30-f6abbd37dcc7\") " pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:37:13.520072 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.520047 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-4zsj9\"" Apr 22 18:37:13.528858 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.528840 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh" Apr 22 18:37:13.557826 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.557800 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-ds9w2\"" Apr 22 18:37:13.566499 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.566480 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:37:13.636633 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.636574 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-wdqjw\"" Apr 22 18:37:13.644625 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.644604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" Apr 22 18:37:13.644892 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.644860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:37:13.647533 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.647510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f20c356d-ebd8-4177-92c7-8bf2571249a2-metrics-certs\") pod \"network-metrics-daemon-vqnz5\" (UID: \"f20c356d-ebd8-4177-92c7-8bf2571249a2\") " pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:37:13.915911 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.915839 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-95dwn\"" Apr 22 18:37:13.923594 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:13.923568 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vqnz5" Apr 22 18:37:14.432917 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:14.432876 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh"] Apr 22 18:37:14.450646 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:14.450607 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7786869f5d-r5qgc"] Apr 22 18:37:14.464803 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:14.464764 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h"] Apr 22 18:37:14.470426 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:37:14.470389 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd89211d4_cc09_4e34_bb17_86aaf93fef39.slice/crio-fd2d2368dd4bb1f592275c671a2ada29ee8b8d66af9f1108f2aabbea9418b100 WatchSource:0}: Error finding container fd2d2368dd4bb1f592275c671a2ada29ee8b8d66af9f1108f2aabbea9418b100: Status 404 returned error can't find the container with id fd2d2368dd4bb1f592275c671a2ada29ee8b8d66af9f1108f2aabbea9418b100 Apr 22 18:37:14.483208 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:14.483182 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vqnz5"] Apr 22 18:37:14.488871 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:37:14.488803 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf20c356d_ebd8_4177_92c7_8bf2571249a2.slice/crio-cfddaa9c4e9f6d7553c763cd7653c8de561e9b4229b17302334f8242935cda24 WatchSource:0}: Error finding container cfddaa9c4e9f6d7553c763cd7653c8de561e9b4229b17302334f8242935cda24: Status 404 returned error can't find the container with id 
cfddaa9c4e9f6d7553c763cd7653c8de561e9b4229b17302334f8242935cda24 Apr 22 18:37:15.186336 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.186292 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7786869f5d-r5qgc" event={"ID":"94003369-e42a-4e67-ab30-f6abbd37dcc7","Type":"ContainerStarted","Data":"b424e46c938b40efd42c8fc00a36e9115be0a0c4a6b88ac65b1589b139b644c8"} Apr 22 18:37:15.186336 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.186341 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7786869f5d-r5qgc" event={"ID":"94003369-e42a-4e67-ab30-f6abbd37dcc7","Type":"ContainerStarted","Data":"0b9b282b1e0eb4bee589a949ef7b39b831076f821dc2b4108aca7879e0497be7"} Apr 22 18:37:15.187979 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.187951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vc87s" event={"ID":"25870685-4492-459c-a0e2-82a05d74127a","Type":"ContainerStarted","Data":"d42b25c6a4319cbbf04c8b0c2fd0e86558d6959b6fe603bfb860aeb27aeabcbd"} Apr 22 18:37:15.188098 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.187983 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vc87s" event={"ID":"25870685-4492-459c-a0e2-82a05d74127a","Type":"ContainerStarted","Data":"9a9dbfce87f4b0603c805ea963b6c33eb28d1c716a585d98bdabe45549978782"} Apr 22 18:37:15.188098 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.188093 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vc87s" Apr 22 18:37:15.189146 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.189119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh" event={"ID":"b75815e5-b1cc-44af-8c53-32f3fc0feaec","Type":"ContainerStarted","Data":"e36229eb977963a2ae3106664709e2a557e5ebc941a76bce961501c327ecd17b"} Apr 22 18:37:15.190758 
ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.190730 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vtrxb" event={"ID":"84a0c8c1-7bbb-41ec-86d3-1c379b8789ff","Type":"ContainerStarted","Data":"51f4f4bcdbf1377b5a6575e005bac477b9b2963253ed15f2d9d5ed33c26ee2c1"} Apr 22 18:37:15.191931 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.191908 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" event={"ID":"d89211d4-cc09-4e34-bb17-86aaf93fef39","Type":"ContainerStarted","Data":"fd2d2368dd4bb1f592275c671a2ada29ee8b8d66af9f1108f2aabbea9418b100"} Apr 22 18:37:15.193097 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.193072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vqnz5" event={"ID":"f20c356d-ebd8-4177-92c7-8bf2571249a2","Type":"ContainerStarted","Data":"cfddaa9c4e9f6d7553c763cd7653c8de561e9b4229b17302334f8242935cda24"} Apr 22 18:37:15.207160 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.207106 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7786869f5d-r5qgc" podStartSLOduration=34.207093056 podStartE2EDuration="34.207093056s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:37:15.205539003 +0000 UTC m=+67.774040810" watchObservedRunningTime="2026-04-22 18:37:15.207093056 +0000 UTC m=+67.775594864" Apr 22 18:37:15.223184 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.223146 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vtrxb" podStartSLOduration=33.580971965 podStartE2EDuration="36.223132873s" podCreationTimestamp="2026-04-22 18:36:39 +0000 UTC" firstStartedPulling="2026-04-22 18:37:11.63842418 +0000 
UTC m=+64.206925968" lastFinishedPulling="2026-04-22 18:37:14.280585084 +0000 UTC m=+66.849086876" observedRunningTime="2026-04-22 18:37:15.2218872 +0000 UTC m=+67.790389007" watchObservedRunningTime="2026-04-22 18:37:15.223132873 +0000 UTC m=+67.791634679" Apr 22 18:37:15.239894 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.239851 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vc87s" podStartSLOduration=33.637734388 podStartE2EDuration="36.239835228s" podCreationTimestamp="2026-04-22 18:36:39 +0000 UTC" firstStartedPulling="2026-04-22 18:37:11.676899655 +0000 UTC m=+64.245401443" lastFinishedPulling="2026-04-22 18:37:14.279000413 +0000 UTC m=+66.847502283" observedRunningTime="2026-04-22 18:37:15.239478565 +0000 UTC m=+67.807980371" watchObservedRunningTime="2026-04-22 18:37:15.239835228 +0000 UTC m=+67.808337038" Apr 22 18:37:15.567587 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.567549 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:37:15.570402 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:15.570377 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:37:16.198309 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:16.198263 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vqnz5" event={"ID":"f20c356d-ebd8-4177-92c7-8bf2571249a2","Type":"ContainerStarted","Data":"89e40499bf0e8dab693ab1efe0425ac45f63f0d266372d8d7df7b02b3f5ffd99"} Apr 22 18:37:16.198895 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:16.198828 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:37:16.199886 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:16.199866 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ingress/router-default-7786869f5d-r5qgc" Apr 22 18:37:16.901980 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:16.901893 2577 scope.go:117] "RemoveContainer" containerID="4e5dce6b061f38cf86cc00ab99946644be69127d9a6ad7426975219ae1264a0c" Apr 22 18:37:17.203223 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:17.203136 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vqnz5" event={"ID":"f20c356d-ebd8-4177-92c7-8bf2571249a2","Type":"ContainerStarted","Data":"1928e9f7042ba96751d6c1599039cbb653a539b23f807c4bc6debc48013ef715"} Apr 22 18:37:17.222566 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:17.222472 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vqnz5" podStartSLOduration=67.737368555 podStartE2EDuration="1m9.22245676s" podCreationTimestamp="2026-04-22 18:36:08 +0000 UTC" firstStartedPulling="2026-04-22 18:37:14.491881557 +0000 UTC m=+67.060383342" lastFinishedPulling="2026-04-22 18:37:15.976969759 +0000 UTC m=+68.545471547" observedRunningTime="2026-04-22 18:37:17.221684458 +0000 UTC m=+69.790186267" watchObservedRunningTime="2026-04-22 18:37:17.22245676 +0000 UTC m=+69.790958597" Apr 22 18:37:18.207821 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.207781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" event={"ID":"d89211d4-cc09-4e34-bb17-86aaf93fef39","Type":"ContainerStarted","Data":"bb935bc01d7d6ab8159d396c6a60d975fb16834cc465dc4c4c2bbd08fee18cb5"} Apr 22 18:37:18.209541 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.209521 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:37:18.209661 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.209640 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" event={"ID":"5e96d5c9-534c-4e08-b6a6-5f20b407b3e3","Type":"ContainerStarted","Data":"8772c9b78bfd4fffbb6a3d15327f9e040d18bd0d4dbc946290db6ee460e5071c"} Apr 22 18:37:18.209935 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.209917 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" Apr 22 18:37:18.211387 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.211359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh" event={"ID":"b75815e5-b1cc-44af-8c53-32f3fc0feaec","Type":"ContainerStarted","Data":"9f0293765983c862778b5ff609bba46e192363a27c6d3e9d73f214ef7fa30b25"} Apr 22 18:37:18.211489 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.211404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh" event={"ID":"b75815e5-b1cc-44af-8c53-32f3fc0feaec","Type":"ContainerStarted","Data":"12dfb60eb1e917df0b1bc0a26eb631556b78a1fee09b32fdce81dd99366875c3"} Apr 22 18:37:18.227499 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.227456 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" podStartSLOduration=34.295458365 podStartE2EDuration="37.227444811s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="2026-04-22 18:37:14.472806553 +0000 UTC m=+67.041308338" lastFinishedPulling="2026-04-22 18:37:17.404792995 +0000 UTC m=+69.973294784" observedRunningTime="2026-04-22 18:37:18.226212264 +0000 UTC m=+70.794714071" watchObservedRunningTime="2026-04-22 18:37:18.227444811 +0000 UTC m=+70.795946975" Apr 22 18:37:18.242184 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.242145 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fljkh" podStartSLOduration=34.331182069 podStartE2EDuration="37.242132129s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="2026-04-22 18:37:14.505159154 +0000 UTC m=+67.073660939" lastFinishedPulling="2026-04-22 18:37:17.416109201 +0000 UTC m=+69.984610999" observedRunningTime="2026-04-22 18:37:18.242122623 +0000 UTC m=+70.810624431" watchObservedRunningTime="2026-04-22 18:37:18.242132129 +0000 UTC m=+70.810633932" Apr 22 18:37:18.260079 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.260036 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" podStartSLOduration=24.517418216 podStartE2EDuration="37.260023659s" podCreationTimestamp="2026-04-22 18:36:41 +0000 UTC" firstStartedPulling="2026-04-22 18:36:41.965601882 +0000 UTC m=+34.534103673" lastFinishedPulling="2026-04-22 18:36:54.708207322 +0000 UTC m=+47.276709116" observedRunningTime="2026-04-22 18:37:18.259564406 +0000 UTC m=+70.828066215" watchObservedRunningTime="2026-04-22 18:37:18.260023659 +0000 UTC m=+70.828525468" Apr 22 18:37:18.287777 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:18.287757 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-vllgk" Apr 22 18:37:20.568136 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.568102 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hv22f"] Apr 22 18:37:20.571357 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.571334 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.573579 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.573546 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7qsn6\"" Apr 22 18:37:20.574110 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.574093 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:37:20.574110 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.574100 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:37:20.591198 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.591176 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hv22f"] Apr 22 18:37:20.595816 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.595797 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c7911d2a-687f-4929-8208-8de15cc2e64b-data-volume\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.595923 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.595843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8ptv\" (UniqueName: \"kubernetes.io/projected/c7911d2a-687f-4929-8208-8de15cc2e64b-kube-api-access-j8ptv\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.595923 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.595875 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c7911d2a-687f-4929-8208-8de15cc2e64b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.595923 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.595892 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c7911d2a-687f-4929-8208-8de15cc2e64b-crio-socket\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.595923 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.595919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c7911d2a-687f-4929-8208-8de15cc2e64b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.696446 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.696425 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8ptv\" (UniqueName: \"kubernetes.io/projected/c7911d2a-687f-4929-8208-8de15cc2e64b-kube-api-access-j8ptv\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.696544 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.696494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c7911d2a-687f-4929-8208-8de15cc2e64b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hv22f\" (UID: 
\"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.696544 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.696524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c7911d2a-687f-4929-8208-8de15cc2e64b-crio-socket\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.696663 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.696568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c7911d2a-687f-4929-8208-8de15cc2e64b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.696663 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.696616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c7911d2a-687f-4929-8208-8de15cc2e64b-data-volume\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.696756 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.696660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c7911d2a-687f-4929-8208-8de15cc2e64b-crio-socket\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.696941 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.696921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/c7911d2a-687f-4929-8208-8de15cc2e64b-data-volume\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.697065 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.697048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c7911d2a-687f-4929-8208-8de15cc2e64b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.698621 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.698603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c7911d2a-687f-4929-8208-8de15cc2e64b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.706811 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.706793 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8ptv\" (UniqueName: \"kubernetes.io/projected/c7911d2a-687f-4929-8208-8de15cc2e64b-kube-api-access-j8ptv\") pod \"insights-runtime-extractor-hv22f\" (UID: \"c7911d2a-687f-4929-8208-8de15cc2e64b\") " pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:20.884280 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:20.884248 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hv22f" Apr 22 18:37:21.003648 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:21.003621 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hv22f"] Apr 22 18:37:21.006394 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:37:21.006370 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7911d2a_687f_4929_8208_8de15cc2e64b.slice/crio-82e962ecdfe023e62677558508c5b48118aa46433f055b53396cd2175a9a122d WatchSource:0}: Error finding container 82e962ecdfe023e62677558508c5b48118aa46433f055b53396cd2175a9a122d: Status 404 returned error can't find the container with id 82e962ecdfe023e62677558508c5b48118aa46433f055b53396cd2175a9a122d Apr 22 18:37:21.220345 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:21.220306 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hv22f" event={"ID":"c7911d2a-687f-4929-8208-8de15cc2e64b","Type":"ContainerStarted","Data":"1264b255850ca8c80d1efa5f82d00cd61ccdd73640ca979453660000bf6e7ec1"} Apr 22 18:37:21.220345 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:21.220350 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hv22f" event={"ID":"c7911d2a-687f-4929-8208-8de15cc2e64b","Type":"ContainerStarted","Data":"82e962ecdfe023e62677558508c5b48118aa46433f055b53396cd2175a9a122d"} Apr 22 18:37:22.224374 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:22.224342 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hv22f" event={"ID":"c7911d2a-687f-4929-8208-8de15cc2e64b","Type":"ContainerStarted","Data":"477ab7f990ef71666c7c2e588a9aa85e3d4459bf77f39274a86e39e4f0ffbca6"} Apr 22 18:37:24.231999 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:24.231914 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-insights/insights-runtime-extractor-hv22f" event={"ID":"c7911d2a-687f-4929-8208-8de15cc2e64b","Type":"ContainerStarted","Data":"91e098e8ce1d51e2f840efa41418b173c6a624351903c6413b12e9218ae69b1d"} Apr 22 18:37:24.256464 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:24.256422 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hv22f" podStartSLOduration=1.398289804 podStartE2EDuration="4.256407836s" podCreationTimestamp="2026-04-22 18:37:20 +0000 UTC" firstStartedPulling="2026-04-22 18:37:21.059959053 +0000 UTC m=+73.628460839" lastFinishedPulling="2026-04-22 18:37:23.918077086 +0000 UTC m=+76.486578871" observedRunningTime="2026-04-22 18:37:24.255472859 +0000 UTC m=+76.823974666" watchObservedRunningTime="2026-04-22 18:37:24.256407836 +0000 UTC m=+76.824909643" Apr 22 18:37:25.200184 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:25.200135 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vc87s" Apr 22 18:37:26.123641 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:26.123608 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-p756f" Apr 22 18:37:31.485893 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:31.485854 2577 patch_prober.go:28] interesting pod/image-registry-56667fd489-p9kvx container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:37:31.486283 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:31.485937 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" podUID="171e80c5-873b-40fc-b152-1a96147b240e" containerName="registry" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 22 18:37:32.254659 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:32.254635 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:37:32.254779 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:32.254680 2577 generic.go:358] "Generic (PLEG): container finished" podID="d89211d4-cc09-4e34-bb17-86aaf93fef39" containerID="bb935bc01d7d6ab8159d396c6a60d975fb16834cc465dc4c4c2bbd08fee18cb5" exitCode=2 Apr 22 18:37:32.254779 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:32.254750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" event={"ID":"d89211d4-cc09-4e34-bb17-86aaf93fef39","Type":"ContainerDied","Data":"bb935bc01d7d6ab8159d396c6a60d975fb16834cc465dc4c4c2bbd08fee18cb5"} Apr 22 18:37:32.255074 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:32.255061 2577 scope.go:117] "RemoveContainer" containerID="bb935bc01d7d6ab8159d396c6a60d975fb16834cc465dc4c4c2bbd08fee18cb5" Apr 22 18:37:33.182736 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:33.182709 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:37:33.259920 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:33.259894 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:37:33.260062 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:33.259977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dbp2h" 
event={"ID":"d89211d4-cc09-4e34-bb17-86aaf93fef39","Type":"ContainerStarted","Data":"6b6eec3b660205565341cb3db80b0c7da8538ebfc5c083a35db32346bc2c149e"} Apr 22 18:37:36.157809 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.157777 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zvd89"] Apr 22 18:37:36.162258 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.162225 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.164504 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.164482 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:37:36.169611 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.165892 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:37:36.169611 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.166064 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:37:36.169611 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.166516 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:37:36.169611 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.167095 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6pfl9\"" Apr 22 18:37:36.206949 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.206926 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-textfile\") pod \"node-exporter-zvd89\" (UID: 
\"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.207055 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.206961 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9t5b\" (UniqueName: \"kubernetes.io/projected/194baeb1-89ff-49f3-86a1-d288eb5c87f7-kube-api-access-h9t5b\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.207055 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.206979 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/194baeb1-89ff-49f3-86a1-d288eb5c87f7-metrics-client-ca\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.207055 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.207037 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-accelerators-collector-config\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.207181 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.207060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-wtmp\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.207181 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.207082 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/194baeb1-89ff-49f3-86a1-d288eb5c87f7-root\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.207181 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.207097 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/194baeb1-89ff-49f3-86a1-d288eb5c87f7-sys\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.207181 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.207116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-tls\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.207181 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.207132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307579 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307554 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-accelerators-collector-config\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " 
pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307687 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-wtmp\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307687 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/194baeb1-89ff-49f3-86a1-d288eb5c87f7-root\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307687 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/194baeb1-89ff-49f3-86a1-d288eb5c87f7-sys\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307687 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-tls\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307687 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zvd89\" (UID: 
\"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307943 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307693 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/194baeb1-89ff-49f3-86a1-d288eb5c87f7-root\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307943 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-textfile\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307943 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/194baeb1-89ff-49f3-86a1-d288eb5c87f7-sys\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307943 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-wtmp\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307943 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307771 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9t5b\" (UniqueName: \"kubernetes.io/projected/194baeb1-89ff-49f3-86a1-d288eb5c87f7-kube-api-access-h9t5b\") pod \"node-exporter-zvd89\" (UID: 
\"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307943 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:37:36.307781 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:37:36.307943 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/194baeb1-89ff-49f3-86a1-d288eb5c87f7-metrics-client-ca\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.307943 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:37:36.307850 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-tls podName:194baeb1-89ff-49f3-86a1-d288eb5c87f7 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:36.807829853 +0000 UTC m=+89.376331655 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-tls") pod "node-exporter-zvd89" (UID: "194baeb1-89ff-49f3-86a1-d288eb5c87f7") : secret "node-exporter-tls" not found Apr 22 18:37:36.308346 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.307993 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-textfile\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.308346 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.308208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-accelerators-collector-config\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.308346 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.308266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/194baeb1-89ff-49f3-86a1-d288eb5c87f7-metrics-client-ca\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.309864 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.309847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.325137 ip-10-0-133-29 
kubenswrapper[2577]: I0422 18:37:36.325116 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9t5b\" (UniqueName: \"kubernetes.io/projected/194baeb1-89ff-49f3-86a1-d288eb5c87f7-kube-api-access-h9t5b\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.810722 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:36.810689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-tls\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:36.810912 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:37:36.810863 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:37:36.810979 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:37:36.810935 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-tls podName:194baeb1-89ff-49f3-86a1-d288eb5c87f7 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:37.81091675 +0000 UTC m=+90.379418554 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-tls") pod "node-exporter-zvd89" (UID: "194baeb1-89ff-49f3-86a1-d288eb5c87f7") : secret "node-exporter-tls" not found Apr 22 18:37:37.818852 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:37.818817 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-tls\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:37.821030 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:37.821010 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/194baeb1-89ff-49f3-86a1-d288eb5c87f7-node-exporter-tls\") pod \"node-exporter-zvd89\" (UID: \"194baeb1-89ff-49f3-86a1-d288eb5c87f7\") " pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:37.974695 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:37.974662 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zvd89" Apr 22 18:37:37.984352 ip-10-0-133-29 kubenswrapper[2577]: W0422 18:37:37.984328 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod194baeb1_89ff_49f3_86a1_d288eb5c87f7.slice/crio-9ba84762e30271006c3e3c38e442e565258f6a0d3ce4ec00ce027675b956d3c6 WatchSource:0}: Error finding container 9ba84762e30271006c3e3c38e442e565258f6a0d3ce4ec00ce027675b956d3c6: Status 404 returned error can't find the container with id 9ba84762e30271006c3e3c38e442e565258f6a0d3ce4ec00ce027675b956d3c6 Apr 22 18:37:38.274889 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:38.274855 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvd89" event={"ID":"194baeb1-89ff-49f3-86a1-d288eb5c87f7","Type":"ContainerStarted","Data":"9ba84762e30271006c3e3c38e442e565258f6a0d3ce4ec00ce027675b956d3c6"} Apr 22 18:37:39.279469 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:39.279438 2577 generic.go:358] "Generic (PLEG): container finished" podID="194baeb1-89ff-49f3-86a1-d288eb5c87f7" containerID="3ec2e621fb03cf7cd68b4def7a4d912d50f8fb37cef23f87191c9b66fe85ebef" exitCode=0 Apr 22 18:37:39.279832 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:39.279478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvd89" event={"ID":"194baeb1-89ff-49f3-86a1-d288eb5c87f7","Type":"ContainerDied","Data":"3ec2e621fb03cf7cd68b4def7a4d912d50f8fb37cef23f87191c9b66fe85ebef"} Apr 22 18:37:40.284378 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:40.284338 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvd89" event={"ID":"194baeb1-89ff-49f3-86a1-d288eb5c87f7","Type":"ContainerStarted","Data":"ade4fff018eb559bfec0e8c6c87e362a65f5b9e54ac33fa646485ce486260805"} Apr 22 18:37:40.284378 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:40.284378 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvd89" event={"ID":"194baeb1-89ff-49f3-86a1-d288eb5c87f7","Type":"ContainerStarted","Data":"2af9518e0f15e90d1a085bdf0c163a845187347fc7812c4a0173369045850077"} Apr 22 18:37:40.308113 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:40.308070 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zvd89" podStartSLOduration=3.49204727 podStartE2EDuration="4.308057565s" podCreationTimestamp="2026-04-22 18:37:36 +0000 UTC" firstStartedPulling="2026-04-22 18:37:37.986363789 +0000 UTC m=+90.554865591" lastFinishedPulling="2026-04-22 18:37:38.802374101 +0000 UTC m=+91.370875886" observedRunningTime="2026-04-22 18:37:40.30716788 +0000 UTC m=+92.875669698" watchObservedRunningTime="2026-04-22 18:37:40.308057565 +0000 UTC m=+92.876559372" Apr 22 18:37:43.041981 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:37:43.041952 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56667fd489-p9kvx"] Apr 22 18:38:08.064196 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.064136 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" podUID="171e80c5-873b-40fc-b152-1a96147b240e" containerName="registry" containerID="cri-o://b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348" gracePeriod=30 Apr 22 18:38:08.300742 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.300721 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:38:08.360661 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.360584 2577 generic.go:358] "Generic (PLEG): container finished" podID="171e80c5-873b-40fc-b152-1a96147b240e" containerID="b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348" exitCode=0 Apr 22 18:38:08.360661 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.360642 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" Apr 22 18:38:08.360661 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.360659 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" event={"ID":"171e80c5-873b-40fc-b152-1a96147b240e","Type":"ContainerDied","Data":"b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348"} Apr 22 18:38:08.360874 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.360689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56667fd489-p9kvx" event={"ID":"171e80c5-873b-40fc-b152-1a96147b240e","Type":"ContainerDied","Data":"c446098cf47a098cd24cb4503c504c388c77c9f5801998580f5b2fcaccd4a592"} Apr 22 18:38:08.360874 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.360708 2577 scope.go:117] "RemoveContainer" containerID="b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348" Apr 22 18:38:08.368919 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.368904 2577 scope.go:117] "RemoveContainer" containerID="b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348" Apr 22 18:38:08.369183 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:38:08.369161 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348\": container with ID starting with 
b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348 not found: ID does not exist" containerID="b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348" Apr 22 18:38:08.369259 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.369188 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348"} err="failed to get container status \"b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348\": rpc error: code = NotFound desc = could not find container \"b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348\": container with ID starting with b9ba00996f4dd758f0549e46916c6052f8f72acce691e6e9763e0e3e1c433348 not found: ID does not exist" Apr 22 18:38:08.458569 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.458546 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/171e80c5-873b-40fc-b152-1a96147b240e-ca-trust-extracted\") pod \"171e80c5-873b-40fc-b152-1a96147b240e\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " Apr 22 18:38:08.458668 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.458587 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-trusted-ca\") pod \"171e80c5-873b-40fc-b152-1a96147b240e\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " Apr 22 18:38:08.458668 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.458624 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-installation-pull-secrets\") pod \"171e80c5-873b-40fc-b152-1a96147b240e\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " Apr 22 18:38:08.458742 ip-10-0-133-29 kubenswrapper[2577]: I0422 
18:38:08.458687 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-registry-certificates\") pod \"171e80c5-873b-40fc-b152-1a96147b240e\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " Apr 22 18:38:08.458829 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.458809 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-bound-sa-token\") pod \"171e80c5-873b-40fc-b152-1a96147b240e\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " Apr 22 18:38:08.458868 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.458854 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-image-registry-private-configuration\") pod \"171e80c5-873b-40fc-b152-1a96147b240e\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " Apr 22 18:38:08.458912 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.458889 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw6l5\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-kube-api-access-zw6l5\") pod \"171e80c5-873b-40fc-b152-1a96147b240e\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " Apr 22 18:38:08.458949 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.458919 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") pod \"171e80c5-873b-40fc-b152-1a96147b240e\" (UID: \"171e80c5-873b-40fc-b152-1a96147b240e\") " Apr 22 18:38:08.459064 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.459039 2577 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "171e80c5-873b-40fc-b152-1a96147b240e" (UID: "171e80c5-873b-40fc-b152-1a96147b240e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:38:08.459408 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.459389 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-trusted-ca\") on node \"ip-10-0-133-29.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.459509 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.459421 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "171e80c5-873b-40fc-b152-1a96147b240e" (UID: "171e80c5-873b-40fc-b152-1a96147b240e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:38:08.461345 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.461318 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "171e80c5-873b-40fc-b152-1a96147b240e" (UID: "171e80c5-873b-40fc-b152-1a96147b240e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:38:08.461443 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.461404 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "171e80c5-873b-40fc-b152-1a96147b240e" (UID: "171e80c5-873b-40fc-b152-1a96147b240e"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:38:08.461570 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.461546 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "171e80c5-873b-40fc-b152-1a96147b240e" (UID: "171e80c5-873b-40fc-b152-1a96147b240e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:38:08.461643 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.461573 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "171e80c5-873b-40fc-b152-1a96147b240e" (UID: "171e80c5-873b-40fc-b152-1a96147b240e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:38:08.461643 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.461604 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-kube-api-access-zw6l5" (OuterVolumeSpecName: "kube-api-access-zw6l5") pod "171e80c5-873b-40fc-b152-1a96147b240e" (UID: "171e80c5-873b-40fc-b152-1a96147b240e"). InnerVolumeSpecName "kube-api-access-zw6l5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:38:08.467711 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.467689 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/171e80c5-873b-40fc-b152-1a96147b240e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "171e80c5-873b-40fc-b152-1a96147b240e" (UID: "171e80c5-873b-40fc-b152-1a96147b240e"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:38:08.560480 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.560439 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-installation-pull-secrets\") on node \"ip-10-0-133-29.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.560480 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.560478 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/171e80c5-873b-40fc-b152-1a96147b240e-registry-certificates\") on node \"ip-10-0-133-29.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.560647 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.560494 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-bound-sa-token\") on node \"ip-10-0-133-29.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.560647 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.560508 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/171e80c5-873b-40fc-b152-1a96147b240e-image-registry-private-configuration\") on node \"ip-10-0-133-29.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.560647 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.560521 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zw6l5\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-kube-api-access-zw6l5\") on node \"ip-10-0-133-29.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.560647 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.560533 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/171e80c5-873b-40fc-b152-1a96147b240e-registry-tls\") on node \"ip-10-0-133-29.ec2.internal\" DevicePath \"\"" 
Apr 22 18:38:08.560647 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.560545 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/171e80c5-873b-40fc-b152-1a96147b240e-ca-trust-extracted\") on node \"ip-10-0-133-29.ec2.internal\" DevicePath \"\"" Apr 22 18:38:08.691853 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.691823 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56667fd489-p9kvx"] Apr 22 18:38:08.698179 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:08.698153 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56667fd489-p9kvx"] Apr 22 18:38:09.905648 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:09.905613 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="171e80c5-873b-40fc-b152-1a96147b240e" path="/var/lib/kubelet/pods/171e80c5-873b-40fc-b152-1a96147b240e/volumes" Apr 22 18:38:15.777024 ip-10-0-133-29 kubenswrapper[2577]: E0422 18:38:15.776998 2577 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79c52d40_7d6b_4114_9d7d_80fa7fd19bf2.slice/crio-3bb3634f7ef385c28c9e42070f9a4c40c1dbcd2628391ce8513414b4ab59a55d.scope\": RecentStats: unable to find data in memory cache]" Apr 22 18:38:16.384562 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:16.384479 2577 generic.go:358] "Generic (PLEG): container finished" podID="79c52d40-7d6b-4114-9d7d-80fa7fd19bf2" containerID="3bb3634f7ef385c28c9e42070f9a4c40c1dbcd2628391ce8513414b4ab59a55d" exitCode=0 Apr 22 18:38:16.384719 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:16.384563 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dxw7s" 
event={"ID":"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2","Type":"ContainerDied","Data":"3bb3634f7ef385c28c9e42070f9a4c40c1dbcd2628391ce8513414b4ab59a55d"} Apr 22 18:38:16.384930 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:16.384910 2577 scope.go:117] "RemoveContainer" containerID="3bb3634f7ef385c28c9e42070f9a4c40c1dbcd2628391ce8513414b4ab59a55d" Apr 22 18:38:16.385918 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:16.385893 2577 generic.go:358] "Generic (PLEG): container finished" podID="77f0cc31-e890-43ad-a890-116520afc036" containerID="33272609056c699e29136cb9d425e6c36b55c069e1086151588df1b58362f937" exitCode=0 Apr 22 18:38:16.386019 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:16.385931 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq" event={"ID":"77f0cc31-e890-43ad-a890-116520afc036","Type":"ContainerDied","Data":"33272609056c699e29136cb9d425e6c36b55c069e1086151588df1b58362f937"} Apr 22 18:38:16.386193 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:16.386179 2577 scope.go:117] "RemoveContainer" containerID="33272609056c699e29136cb9d425e6c36b55c069e1086151588df1b58362f937" Apr 22 18:38:17.390326 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:17.390289 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-tkzvq" event={"ID":"77f0cc31-e890-43ad-a890-116520afc036","Type":"ContainerStarted","Data":"9cbdf907738c256bdee9c56ee5ee85ceeee7d4e27681cbf881556b02fd88ddb4"} Apr 22 18:38:17.391921 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:17.391899 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-dxw7s" event={"ID":"79c52d40-7d6b-4114-9d7d-80fa7fd19bf2","Type":"ContainerStarted","Data":"cc4a6d504923d8dffd72c1b87009998400e789b31c3ed136ee49fdfacea329f4"} Apr 22 18:38:19.637164 ip-10-0-133-29 
kubenswrapper[2577]: I0422 18:38:19.637128 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" podUID="8a55aad3-c17e-4873-8971-af9571fa6014" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:38:21.404411 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:21.404373 2577 generic.go:358] "Generic (PLEG): container finished" podID="88a79e60-2ccb-4f68-bbaf-9ae07317bdf1" containerID="22f3b830929153400af89c61d2f6f99cfb31b91728e7865eba9269ddaca62d58" exitCode=0 Apr 22 18:38:21.404959 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:21.404444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" event={"ID":"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1","Type":"ContainerDied","Data":"22f3b830929153400af89c61d2f6f99cfb31b91728e7865eba9269ddaca62d58"} Apr 22 18:38:21.404959 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:21.404802 2577 scope.go:117] "RemoveContainer" containerID="22f3b830929153400af89c61d2f6f99cfb31b91728e7865eba9269ddaca62d58" Apr 22 18:38:22.408242 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:22.408183 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-tt2kq" event={"ID":"88a79e60-2ccb-4f68-bbaf-9ae07317bdf1","Type":"ContainerStarted","Data":"f11877cd518ab86b276cf59cf84baf119d346024b145adb8052f0b569312b15f"} Apr 22 18:38:29.638000 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:29.637953 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" podUID="8a55aad3-c17e-4873-8971-af9571fa6014" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:38:39.637382 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:39.637338 2577 prober.go:120] "Probe 
failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" podUID="8a55aad3-c17e-4873-8971-af9571fa6014" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:38:39.637775 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:39.637413 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" Apr 22 18:38:39.637918 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:39.637900 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"273a9e47083486417cb9cbe774602a2231119a97f34007746b1f83c3f8d209c9"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 18:38:39.637954 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:39.637942 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" podUID="8a55aad3-c17e-4873-8971-af9571fa6014" containerName="service-proxy" containerID="cri-o://273a9e47083486417cb9cbe774602a2231119a97f34007746b1f83c3f8d209c9" gracePeriod=30 Apr 22 18:38:40.463464 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:40.463431 2577 generic.go:358] "Generic (PLEG): container finished" podID="8a55aad3-c17e-4873-8971-af9571fa6014" containerID="273a9e47083486417cb9cbe774602a2231119a97f34007746b1f83c3f8d209c9" exitCode=2 Apr 22 18:38:40.463637 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:40.463486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" 
event={"ID":"8a55aad3-c17e-4873-8971-af9571fa6014","Type":"ContainerDied","Data":"273a9e47083486417cb9cbe774602a2231119a97f34007746b1f83c3f8d209c9"} Apr 22 18:38:40.463637 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:38:40.463521 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-85c66c68c9-p2xdr" event={"ID":"8a55aad3-c17e-4873-8971-af9571fa6014","Type":"ContainerStarted","Data":"61e9247d95ff70b4acd1655b15ec53ac701f49916e7cb0779fe05a0e7ddda0a5"} Apr 22 18:41:07.847107 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:41:07.847078 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:41:07.847616 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:41:07.847158 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:41:07.850512 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:41:07.850491 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:41:07.850753 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:41:07.850733 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:41:07.860652 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:41:07.860633 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:46:07.867016 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:46:07.866990 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:46:07.868108 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:46:07.868085 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:46:07.870845 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:46:07.870825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:46:07.871799 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:46:07.871778 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:51:07.890054 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:51:07.886171 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:51:07.893311 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:51:07.893286 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:51:07.894433 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:51:07.894412 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:51:07.896760 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:51:07.896743 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:56:07.910220 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:56:07.910193 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:56:07.914000 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:56:07.913980 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 18:56:07.914397 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:56:07.914375 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 18:56:07.917972 ip-10-0-133-29 kubenswrapper[2577]: I0422 18:56:07.917954 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:01:07.928865 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:01:07.928761 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:01:07.932717 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:01:07.932572 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:01:07.932994 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:01:07.932965 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:01:07.936534 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:01:07.936513 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:06:07.946455 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:06:07.946349 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:06:07.953400 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:06:07.950102 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:06:07.953400 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:06:07.951311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:06:07.954701 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:06:07.954683 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:11:07.965114 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:11:07.965006 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:11:07.969095 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:11:07.968709 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:11:07.969571 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:11:07.969553 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:11:07.973192 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:11:07.973170 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:16:07.983862 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:16:07.983760 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:16:07.987848 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:16:07.987831 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:16:07.988718 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:16:07.988701 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:16:07.992318 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:16:07.992293 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:21:08.003194 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:21:08.003082 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:21:08.007208 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:21:08.006870 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:21:08.008454 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:21:08.008433 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:21:08.011997 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:21:08.011977 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:26:08.022164 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:26:08.022130 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:26:08.026042 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:26:08.026018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:26:08.027076 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:26:08.027057 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:26:08.036044 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:26:08.036022 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:31:08.048519 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:31:08.048423 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:31:08.052384 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:31:08.052364 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:31:08.053918 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:31:08.053900 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:31:08.057410 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:31:08.057390 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:36:08.067328 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:36:08.067199 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:36:08.070920 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:36:08.070898 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:36:08.075199 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:36:08.075180 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:36:08.078500 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:36:08.078486 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:41:08.085379 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:41:08.085275 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:41:08.089414 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:41:08.088953 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:41:08.093570 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:41:08.093554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:41:08.097284 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:41:08.097269 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:46:08.103801 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:46:08.103693 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:46:08.109795 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:46:08.109776 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:46:08.114749 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:46:08.114724 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:46:08.123099 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:46:08.123082 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:51:08.129624 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:51:08.129501 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:51:08.133474 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:51:08.133053 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:51:08.138352 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:51:08.138333 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:51:08.141882 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:51:08.141867 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:56:08.147904 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:56:08.147800 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:56:08.151777 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:56:08.151468 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 19:56:08.156868 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:56:08.156850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 19:56:08.159906 ip-10-0-133-29 kubenswrapper[2577]: I0422 19:56:08.159887 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 20:01:08.166326 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:01:08.166200 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 20:01:08.170348 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:01:08.169711 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 20:01:08.177739 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:01:08.177719 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 20:01:08.183822 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:01:08.183805 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 20:06:08.184907 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:06:08.184796 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 20:06:08.189945 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:06:08.188272 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 20:06:08.198806 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:06:08.198789 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 20:06:08.202167 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:06:08.202153 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 20:08:10.147197 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.147161 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k2gsp/must-gather-9nxwm"] Apr 22 20:08:10.147798 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.147566 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="171e80c5-873b-40fc-b152-1a96147b240e" containerName="registry" Apr 22 20:08:10.147798 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.147585 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="171e80c5-873b-40fc-b152-1a96147b240e" containerName="registry" Apr 22 20:08:10.147798 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.147668 2577 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="171e80c5-873b-40fc-b152-1a96147b240e" containerName="registry" Apr 22 20:08:10.150556 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.150535 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k2gsp/must-gather-9nxwm" Apr 22 20:08:10.152585 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.152561 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-k2gsp\"/\"default-dockercfg-dszbq\"" Apr 22 20:08:10.152693 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.152561 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k2gsp\"/\"openshift-service-ca.crt\"" Apr 22 20:08:10.152903 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.152885 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-k2gsp\"/\"kube-root-ca.crt\"" Apr 22 20:08:10.157799 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.157771 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gsp/must-gather-9nxwm"] Apr 22 20:08:10.201115 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.201092 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psz5t\" (UniqueName: \"kubernetes.io/projected/932161d4-07ab-46c4-8e89-93b4a5566599-kube-api-access-psz5t\") pod \"must-gather-9nxwm\" (UID: \"932161d4-07ab-46c4-8e89-93b4a5566599\") " pod="openshift-must-gather-k2gsp/must-gather-9nxwm" Apr 22 20:08:10.201245 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.201132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/932161d4-07ab-46c4-8e89-93b4a5566599-must-gather-output\") pod \"must-gather-9nxwm\" (UID: \"932161d4-07ab-46c4-8e89-93b4a5566599\") " 
pod="openshift-must-gather-k2gsp/must-gather-9nxwm" Apr 22 20:08:10.302115 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.302088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psz5t\" (UniqueName: \"kubernetes.io/projected/932161d4-07ab-46c4-8e89-93b4a5566599-kube-api-access-psz5t\") pod \"must-gather-9nxwm\" (UID: \"932161d4-07ab-46c4-8e89-93b4a5566599\") " pod="openshift-must-gather-k2gsp/must-gather-9nxwm" Apr 22 20:08:10.302250 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.302125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/932161d4-07ab-46c4-8e89-93b4a5566599-must-gather-output\") pod \"must-gather-9nxwm\" (UID: \"932161d4-07ab-46c4-8e89-93b4a5566599\") " pod="openshift-must-gather-k2gsp/must-gather-9nxwm" Apr 22 20:08:10.302512 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.302495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/932161d4-07ab-46c4-8e89-93b4a5566599-must-gather-output\") pod \"must-gather-9nxwm\" (UID: \"932161d4-07ab-46c4-8e89-93b4a5566599\") " pod="openshift-must-gather-k2gsp/must-gather-9nxwm" Apr 22 20:08:10.310219 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.310195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psz5t\" (UniqueName: \"kubernetes.io/projected/932161d4-07ab-46c4-8e89-93b4a5566599-kube-api-access-psz5t\") pod \"must-gather-9nxwm\" (UID: \"932161d4-07ab-46c4-8e89-93b4a5566599\") " pod="openshift-must-gather-k2gsp/must-gather-9nxwm" Apr 22 20:08:10.460279 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.460186 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k2gsp/must-gather-9nxwm" Apr 22 20:08:10.576154 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.576123 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gsp/must-gather-9nxwm"] Apr 22 20:08:10.579126 ip-10-0-133-29 kubenswrapper[2577]: W0422 20:08:10.579099 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod932161d4_07ab_46c4_8e89_93b4a5566599.slice/crio-5b726c9e62332376d1b9cd727e314be7b32546349f7c345f519ec1a6aeefba30 WatchSource:0}: Error finding container 5b726c9e62332376d1b9cd727e314be7b32546349f7c345f519ec1a6aeefba30: Status 404 returned error can't find the container with id 5b726c9e62332376d1b9cd727e314be7b32546349f7c345f519ec1a6aeefba30 Apr 22 20:08:10.580939 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.580923 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:08:10.770872 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:10.770839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gsp/must-gather-9nxwm" event={"ID":"932161d4-07ab-46c4-8e89-93b4a5566599","Type":"ContainerStarted","Data":"5b726c9e62332376d1b9cd727e314be7b32546349f7c345f519ec1a6aeefba30"} Apr 22 20:08:11.776992 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:11.776903 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gsp/must-gather-9nxwm" event={"ID":"932161d4-07ab-46c4-8e89-93b4a5566599","Type":"ContainerStarted","Data":"ce68577081d58a3a2256cb7d98c2cc7ee1334bed1b33be02873cc839a8e0339f"} Apr 22 20:08:12.782511 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:12.782474 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gsp/must-gather-9nxwm" 
event={"ID":"932161d4-07ab-46c4-8e89-93b4a5566599","Type":"ContainerStarted","Data":"c5c1ff73ad1fcb3a38a3acc940c900c698c003bc726d67110aa00631f12dc7c0"} Apr 22 20:08:12.798805 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:12.798758 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k2gsp/must-gather-9nxwm" podStartSLOduration=1.850298934 podStartE2EDuration="2.798742625s" podCreationTimestamp="2026-04-22 20:08:10 +0000 UTC" firstStartedPulling="2026-04-22 20:08:10.581072757 +0000 UTC m=+5523.149574542" lastFinishedPulling="2026-04-22 20:08:11.529516441 +0000 UTC m=+5524.098018233" observedRunningTime="2026-04-22 20:08:12.797172237 +0000 UTC m=+5525.365674042" watchObservedRunningTime="2026-04-22 20:08:12.798742625 +0000 UTC m=+5525.367244432" Apr 22 20:08:13.044513 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:13.044441 2577 ???:1] "http: TLS handshake error from 10.0.133.29:33654: EOF" Apr 22 20:08:13.062605 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:13.062579 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fbr4l_b3a1c2eb-91b0-417e-818b-e08b94eca20e/global-pull-secret-syncer/0.log" Apr 22 20:08:13.127844 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:13.127811 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pjhsl_d6ae7655-3652-4c40-a767-81b41b44d742/konnectivity-agent/0.log" Apr 22 20:08:13.330018 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:13.329949 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-29.ec2.internal_6c73ad6357ff9dfe087208dfa7eeace3/haproxy/0.log" Apr 22 20:08:16.678838 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:16.678760 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/1.log" Apr 22 
20:08:16.809968 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:16.809916 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dbp2h_d89211d4-cc09-4e34-bb17-86aaf93fef39/cluster-monitoring-operator/0.log" Apr 22 20:08:17.176804 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:17.176775 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvd89_194baeb1-89ff-49f3-86a1-d288eb5c87f7/node-exporter/0.log" Apr 22 20:08:17.203252 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:17.203193 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvd89_194baeb1-89ff-49f3-86a1-d288eb5c87f7/kube-rbac-proxy/0.log" Apr 22 20:08:17.229761 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:17.229735 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvd89_194baeb1-89ff-49f3-86a1-d288eb5c87f7/init-textfile/0.log" Apr 22 20:08:19.350636 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.350596 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/1.log" Apr 22 20:08:19.360028 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.359994 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vllgk_5e96d5c9-534c-4e08-b6a6-5f20b407b3e3/console-operator/2.log" Apr 22 20:08:19.865588 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.865522 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm"] Apr 22 20:08:19.869121 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.869098 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:19.875189 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.875165 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm"] Apr 22 20:08:19.988831 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.988800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-proc\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:19.988831 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.988844 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q2zb\" (UniqueName: \"kubernetes.io/projected/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-kube-api-access-2q2zb\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:19.989029 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.988872 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-podres\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:19.989029 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.988918 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-sys\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " 
pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:19.989029 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:19.989001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-lib-modules\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.089402 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.089366 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-sys\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.089567 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.089419 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-lib-modules\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.089567 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.089453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-proc\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.089567 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.089480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2q2zb\" (UniqueName: 
\"kubernetes.io/projected/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-kube-api-access-2q2zb\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.089567 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.089509 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-sys\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.089734 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.089571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-podres\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.089734 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.089584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-proc\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.089734 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.089619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-lib-modules\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.089734 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.089645 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-podres\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.097500 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.097480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q2zb\" (UniqueName: \"kubernetes.io/projected/d02f26d8-c4cf-4af3-a9f5-2f84406941d9-kube-api-access-2q2zb\") pod \"perf-node-gather-daemonset-twqpm\" (UID: \"d02f26d8-c4cf-4af3-a9f5-2f84406941d9\") " pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.179892 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.179862 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.184786 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.184757 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-rmv4w_17abeeb9-50cd-4f77-9ee1-e387268f4e5f/volume-data-source-validator/0.log" Apr 22 20:08:20.317783 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.317751 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm"] Apr 22 20:08:20.320859 ip-10-0-133-29 kubenswrapper[2577]: W0422 20:08:20.320828 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd02f26d8_c4cf_4af3_a9f5_2f84406941d9.slice/crio-564bff332dfec6f18cd148b6d402f112b2531195bd930a6154e231bd80817135 WatchSource:0}: Error finding container 564bff332dfec6f18cd148b6d402f112b2531195bd930a6154e231bd80817135: Status 404 returned error can't find the container with id 
564bff332dfec6f18cd148b6d402f112b2531195bd930a6154e231bd80817135 Apr 22 20:08:20.820005 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.819973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" event={"ID":"d02f26d8-c4cf-4af3-a9f5-2f84406941d9","Type":"ContainerStarted","Data":"2989116402ff6c3403f713599cd3d4f284b140d1deb89d343edac722107274e8"} Apr 22 20:08:20.820005 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.820010 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" event={"ID":"d02f26d8-c4cf-4af3-a9f5-2f84406941d9","Type":"ContainerStarted","Data":"564bff332dfec6f18cd148b6d402f112b2531195bd930a6154e231bd80817135"} Apr 22 20:08:20.820509 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.820102 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:20.835977 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.835935 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" podStartSLOduration=1.83592058 podStartE2EDuration="1.83592058s" podCreationTimestamp="2026-04-22 20:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:08:20.834450326 +0000 UTC m=+5533.402952133" watchObservedRunningTime="2026-04-22 20:08:20.83592058 +0000 UTC m=+5533.404422384" Apr 22 20:08:20.960443 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.960416 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vc87s_25870685-4492-459c-a0e2-82a05d74127a/dns/0.log" Apr 22 20:08:20.982378 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:20.982353 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-vc87s_25870685-4492-459c-a0e2-82a05d74127a/kube-rbac-proxy/0.log" Apr 22 20:08:21.032686 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:21.032657 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sdrsv_71861e02-68a2-47be-ad90-6fd2d00d058b/dns-node-resolver/0.log" Apr 22 20:08:21.555893 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:21.555845 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wm9hb_938b317c-4e75-4cee-9219-836d71fde87b/node-ca/0.log" Apr 22 20:08:22.279063 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:22.279033 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7786869f5d-r5qgc_94003369-e42a-4e67-ab30-f6abbd37dcc7/router/0.log" Apr 22 20:08:22.674470 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:22.674378 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vtrxb_84a0c8c1-7bbb-41ec-86d3-1c379b8789ff/serve-healthcheck-canary/0.log" Apr 22 20:08:23.061129 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:23.061093 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dxw7s_79c52d40-7d6b-4114-9d7d-80fa7fd19bf2/insights-operator/0.log" Apr 22 20:08:23.064095 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:23.064068 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-dxw7s_79c52d40-7d6b-4114-9d7d-80fa7fd19bf2/insights-operator/1.log" Apr 22 20:08:23.164408 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:23.164379 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hv22f_c7911d2a-687f-4929-8208-8de15cc2e64b/kube-rbac-proxy/0.log" Apr 22 20:08:23.186463 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:23.186422 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-hv22f_c7911d2a-687f-4929-8208-8de15cc2e64b/exporter/0.log" Apr 22 20:08:23.212020 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:23.211977 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hv22f_c7911d2a-687f-4929-8208-8de15cc2e64b/extractor/0.log" Apr 22 20:08:26.834954 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:26.834924 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-k2gsp/perf-node-gather-daemonset-twqpm" Apr 22 20:08:29.733708 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:29.733628 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-tkzvq_77f0cc31-e890-43ad-a890-116520afc036/kube-storage-version-migrator-operator/1.log" Apr 22 20:08:29.735157 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:29.735114 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-tkzvq_77f0cc31-e890-43ad-a890-116520afc036/kube-storage-version-migrator-operator/0.log" Apr 22 20:08:30.733389 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:30.733357 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dqfsv_447d848d-6ef3-4b39-a91c-16579bc83c6d/kube-multus-additional-cni-plugins/0.log" Apr 22 20:08:30.760269 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:30.760240 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dqfsv_447d848d-6ef3-4b39-a91c-16579bc83c6d/egress-router-binary-copy/0.log" Apr 22 20:08:30.783638 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:30.783574 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dqfsv_447d848d-6ef3-4b39-a91c-16579bc83c6d/cni-plugins/0.log" Apr 22 20:08:30.806287 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:30.806265 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dqfsv_447d848d-6ef3-4b39-a91c-16579bc83c6d/bond-cni-plugin/0.log" Apr 22 20:08:30.827818 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:30.827790 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dqfsv_447d848d-6ef3-4b39-a91c-16579bc83c6d/routeoverride-cni/0.log" Apr 22 20:08:30.850024 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:30.849989 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dqfsv_447d848d-6ef3-4b39-a91c-16579bc83c6d/whereabouts-cni-bincopy/0.log" Apr 22 20:08:30.876815 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:30.876740 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dqfsv_447d848d-6ef3-4b39-a91c-16579bc83c6d/whereabouts-cni/0.log" Apr 22 20:08:31.454576 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:31.454543 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b7lx2_ff2fec6e-71e1-40b5-a159-2609e7db8ff5/kube-multus/0.log" Apr 22 20:08:31.708578 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:31.708504 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vqnz5_f20c356d-ebd8-4177-92c7-8bf2571249a2/network-metrics-daemon/0.log" Apr 22 20:08:31.734944 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:31.734918 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vqnz5_f20c356d-ebd8-4177-92c7-8bf2571249a2/kube-rbac-proxy/0.log" Apr 22 20:08:32.479187 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:32.479154 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pr85c_946fd2db-0b92-4961-b670-a53e33d7f40f/ovn-controller/0.log" Apr 22 20:08:32.550544 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:32.550508 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pr85c_946fd2db-0b92-4961-b670-a53e33d7f40f/ovn-acl-logging/0.log" Apr 22 20:08:32.572012 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:32.571987 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pr85c_946fd2db-0b92-4961-b670-a53e33d7f40f/kube-rbac-proxy-node/0.log" Apr 22 20:08:32.602354 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:32.602334 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pr85c_946fd2db-0b92-4961-b670-a53e33d7f40f/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:08:32.648061 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:32.648037 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pr85c_946fd2db-0b92-4961-b670-a53e33d7f40f/northd/0.log" Apr 22 20:08:32.677745 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:32.677718 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pr85c_946fd2db-0b92-4961-b670-a53e33d7f40f/nbdb/0.log" Apr 22 20:08:32.711290 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:32.711267 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pr85c_946fd2db-0b92-4961-b670-a53e33d7f40f/sbdb/0.log" Apr 22 20:08:32.881422 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:32.881394 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pr85c_946fd2db-0b92-4961-b670-a53e33d7f40f/ovnkube-controller/0.log" Apr 22 20:08:34.438531 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:34.438501 2577 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-wpxbk_40336ed8-b40b-4340-9566-2d80600db3d6/check-endpoints/0.log" Apr 22 20:08:34.486862 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:34.486838 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-p756f_f632af1b-67e1-4b4d-9446-ea503297edd6/network-check-target-container/0.log" Apr 22 20:08:35.409410 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:35.409380 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-422g7_4f81323a-e699-4fb2-8971-b7fb4c18c2b3/iptables-alerter/0.log" Apr 22 20:08:36.110703 ip-10-0-133-29 kubenswrapper[2577]: I0422 20:08:36.110671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6vvzm_63465ecd-302d-4a7f-b3d3-9b9cc341d995/tuned/0.log"