Apr 17 21:36:26.032383 ip-10-0-141-47 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 21:36:26.538060 ip-10-0-141-47 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:36:26.538060 ip-10-0-141-47 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 21:36:26.538060 ip-10-0-141-47 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:36:26.538060 ip-10-0-141-47 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 21:36:26.538060 ip-10-0-141-47 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:36:26.541805 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.541672 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 21:36:26.545020 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545005 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:36:26.545020 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545020 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545023 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545027 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545030 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545033 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545037 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545040 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545043 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545045 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545048 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545051 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545053 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545056 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545059 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545068 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545070 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545084 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545087 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545090 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545093 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:36:26.545094 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545095 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545103 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545106 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545109 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545111 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545114 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545117 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545120 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545122 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545125 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545127 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545130 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545133 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545135 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545138 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545141 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545143 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545146 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545148 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545151 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:36:26.545576 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545153 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545155 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545158 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545160 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545163 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545166 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545168 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545171 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545173 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545175 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545178 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545180 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545183 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545186 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545189 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545192 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545197 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545201 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545204 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545206 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:36:26.546106 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545209 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545211 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545232 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545236 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545240 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545243 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545246 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545248 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545251 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545255 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545258 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545263 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545267 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545269 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545272 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545275 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545278 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545281 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545284 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:36:26.546605 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545286 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545289 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545292 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545295 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545297 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545299 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545695 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545701 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545704 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545707 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545710 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545712 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545715 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545718 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545721 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545723 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545726 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545728 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545732 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:36:26.547060 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545737 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545739 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545742 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545745 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545747 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545750 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545753 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545756 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545758 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545761 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545763 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545766 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545768 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545771 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545773 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545776 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545779 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545781 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545784 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545786 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:36:26.547556 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545790 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545793 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545795 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545798 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545800 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545803 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545806 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545808 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545811 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545813 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545816 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545818 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545821 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545823 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545826 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545828 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545831 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545833 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545836 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545839 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:36:26.548055 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545842 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545844 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545847 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545849 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545852 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545854 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545856 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545859 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545861 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545864 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545867 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545869 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545872 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545875 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545877 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545880 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545883 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545886 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545888 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545891 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:36:26.548599 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545894 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545896 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545899 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545901 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545904 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545908 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545910 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545913 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545916 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545920 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545923 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545927 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.545930 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546004 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546019 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546027 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546032 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546037 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546040 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546045 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546050 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 21:36:26.549108 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546053 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546056 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546060 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546065 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546068 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546085 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546089 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546092 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546095 2576 flags.go:64] FLAG: --cloud-config=""
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546098 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546101 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546105 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546108 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546112 2576 flags.go:64] FLAG: --config-dir=""
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546115 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546119 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546123 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546126 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546130 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546133 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546136 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546139 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546142 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546146 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546149 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 21:36:26.549617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546154 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546157 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546160 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546163 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546166 2576 flags.go:64] FLAG: --enable-server="true"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546169 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546174 2576 flags.go:64] FLAG: --event-burst="100"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546177 2576 flags.go:64] FLAG: --event-qps="50"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546180 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546184 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546188 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546192 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546195 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546198 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546202 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546205 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546208 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546211 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546214 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546217 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 21:36:26.550248 ip-10-0-141-47
kubenswrapper[2576]: I0417 21:36:26.546220 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546223 2576 flags.go:64] FLAG: --feature-gates="" Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546227 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546230 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546233 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 21:36:26.550248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546236 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546240 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546243 2576 flags.go:64] FLAG: --help="false" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546246 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546250 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546253 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546257 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546261 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546265 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 
21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546268 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546271 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546274 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546277 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546280 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546283 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546286 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546289 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546292 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546295 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546298 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546301 2576 flags.go:64] FLAG: --lock-file="" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546304 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546307 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546310 2576 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 17 21:36:26.550865 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546315 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546318 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546321 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546324 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546327 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546331 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546334 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546337 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546341 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546344 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546348 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546352 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546355 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546358 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 21:36:26.551478 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:36:26.546361 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546364 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546367 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546370 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546378 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546381 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546384 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546387 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546390 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 21:36:26.551478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546396 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546399 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546403 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546406 2576 flags.go:64] FLAG: --port="10250" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546409 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 21:36:26.552040 
ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546412 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-009768d35cf296d1b" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546415 2576 flags.go:64] FLAG: --qos-reserved="" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546418 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546422 2576 flags.go:64] FLAG: --register-node="true" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546425 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546428 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546431 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546434 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546437 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546442 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546446 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546450 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546453 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546456 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546459 2576 flags.go:64] FLAG: --runonce="false" Apr 17 21:36:26.552040 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:36:26.546462 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546465 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546468 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546471 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546474 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546477 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 21:36:26.552040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546481 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546484 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546487 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546490 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546493 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546495 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546498 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546501 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546504 2576 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546510 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546513 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546516 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546520 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546523 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546526 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546529 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546532 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546535 2576 flags.go:64] FLAG: --v="2" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546539 2576 flags.go:64] FLAG: --version="false" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546543 2576 flags.go:64] FLAG: --vmodule="" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546549 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.546552 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546659 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: 
W0417 21:36:26.546664 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:36:26.552679 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546667 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546670 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546673 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546675 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546678 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546680 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546683 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546685 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546694 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546696 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546699 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546701 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:36:26.553266 ip-10-0-141-47 
kubenswrapper[2576]: W0417 21:36:26.546704 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546707 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546710 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546712 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546715 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546718 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546720 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546724 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 21:36:26.553266 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546728 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546731 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546734 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546737 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546739 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546742 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546744 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546747 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546751 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546755 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546758 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546761 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546764 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546767 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546770 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546772 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546775 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546777 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546780 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:36:26.553793 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546782 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546785 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546793 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546796 2576 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546798 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546808 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546811 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546814 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546816 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546819 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546821 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546824 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546827 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546829 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546832 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546834 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:36:26.554287 ip-10-0-141-47 
kubenswrapper[2576]: W0417 21:36:26.546837 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546839 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546842 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546844 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:36:26.554287 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546847 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546850 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546853 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546856 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546858 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546861 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546863 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546866 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546869 2576 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546871 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546874 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546876 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546879 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546881 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546884 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546892 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546895 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546898 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546900 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546903 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:36:26.554773 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546905 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:36:26.555286 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546907 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:36:26.555286 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546910 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:36:26.555286 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546913 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:36:26.555286 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.546916 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:36:26.555286 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.547616 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 21:36:26.555422 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.555354 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 21:36:26.555422 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.555370 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 21:36:26.556835 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.555606 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:36:26.556835 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:26.555611 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:36:26.559787 ip-10-0-141-47 kubenswrapper[2576]: I0417
21:36:26.556907 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 21:36:26.560216 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.560201 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 21:36:26.561388 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.561377 2576 server.go:1019] "Starting client certificate rotation"
Apr 17 21:36:26.561498 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.561480 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 21:36:26.561532 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.561525 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 21:36:26.590763 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.590739 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 21:36:26.594923 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.594887 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 21:36:26.612324 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.612296 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 17 21:36:26.619588 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.619564 2576 log.go:25] "Validated CRI v1 image API"
Apr 17 21:36:26.621713 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.621695 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 21:36:26.623453 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.623436 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 21:36:26.630045 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.630018 2576 fs.go:135] Filesystem UUIDs: map[59d2c9d9-d59f-4545-a6b9-8f05116855cf:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 ee522931-72d9-4287-87b8-1954e625ef6e:/dev/nvme0n1p4]
Apr 17 21:36:26.630131 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.630044 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 21:36:26.636268 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.636150 2576 manager.go:217] Machine: {Timestamp:2026-04-17 21:36:26.634259737 +0000 UTC m=+0.468147656 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098634 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d65a5b16c3919cb4ca856be1942ce SystemUUID:ec2d65a5-b16c-3919-cb4c-a856be1942ce BootID:81cce16e-9274-4db3-a8d3-e7a1a419e790 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:27:75:d5:46:31 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:27:75:d5:46:31 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8a:9f:15:47:95:c7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 21:36:26.636268 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.636262 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 21:36:26.636399 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.636387 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 21:36:26.637659 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.637634 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 21:36:26.637800 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.637663 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-47.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 21:36:26.637840 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.637809 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 21:36:26.637840 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.637818 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 21:36:26.637840 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.637830 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 21:36:26.638751 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.638741 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 21:36:26.640020 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.640010 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 21:36:26.640150 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.640141 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 21:36:26.643423 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.643414 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 17 21:36:26.643461 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.643432 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 21:36:26.643461 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.643444 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 21:36:26.643461 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.643456 2576 kubelet.go:397] "Adding apiserver pod source" Apr 17 21:36:26.643551 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.643464 2576 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 21:36:26.644795 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.644781 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 21:36:26.644856 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.644799 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 21:36:26.649878 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.649348 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 21:36:26.651515 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.651496 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 21:36:26.653684 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653666 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 21:36:26.653684 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653687 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653694 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653700 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653706 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653712 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653717 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 
21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653725 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653736 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653743 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653752 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 21:36:26.653775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.653761 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 21:36:26.655777 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.655762 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 21:36:26.655777 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.655775 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 21:36:26.658163 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.658135 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 21:36:26.658163 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.658156 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 21:36:26.659737 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.659720 2576 watchdog_linux.go:99] 
"Systemd watchdog is not enabled" Apr 17 21:36:26.659828 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.659766 2576 server.go:1295] "Started kubelet" Apr 17 21:36:26.659910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.659848 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 21:36:26.659950 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.659896 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 21:36:26.659950 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.659942 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 21:36:26.660576 ip-10-0-141-47 systemd[1]: Started Kubernetes Kubelet. Apr 17 21:36:26.661479 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.661462 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 21:36:26.662997 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.662984 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 17 21:36:26.664290 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.664271 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zw8q5" Apr 17 21:36:26.668711 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.668691 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-47.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 21:36:26.669106 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.669070 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zw8q5" Apr 17 21:36:26.669739 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.668672 2576 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-47.ec2.internal.18a7429536e8c507 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-47.ec2.internal,UID:ip-10-0-141-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-47.ec2.internal,},FirstTimestamp:2026-04-17 21:36:26.659734791 +0000 UTC m=+0.493622710,LastTimestamp:2026-04-17 21:36:26.659734791 +0000 UTC m=+0.493622710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-47.ec2.internal,}" Apr 17 21:36:26.669968 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.669944 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 21:36:26.670639 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.670624 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 21:36:26.671516 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671495 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 21:36:26.671516 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671517 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 21:36:26.671680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671518 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 21:36:26.671680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671548 2576 factory.go:55] Registering systemd factory Apr 17 21:36:26.671680 ip-10-0-141-47 kubenswrapper[2576]: I0417 
21:36:26.671558 2576 factory.go:223] Registration of the systemd container factory successfully Apr 17 21:36:26.671680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671597 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 21:36:26.671680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671667 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 17 21:36:26.671680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671677 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 17 21:36:26.671680 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.671659 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 21:36:26.671926 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671744 2576 factory.go:153] Registering CRI-O factory Apr 17 21:36:26.671926 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671753 2576 factory.go:223] Registration of the crio container factory successfully Apr 17 21:36:26.671926 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671784 2576 factory.go:103] Registering Raw factory Apr 17 21:36:26.671926 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.671804 2576 manager.go:1196] Started watching for new ooms in manager Apr 17 21:36:26.672115 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.672028 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found" Apr 17 21:36:26.672395 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.672375 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 
21:36:26.672470 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.672399 2576 manager.go:319] Starting recovery of all containers Apr 17 21:36:26.672528 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.672509 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 21:36:26.684608 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.684585 2576 manager.go:324] Recovery completed Apr 17 21:36:26.688772 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.688759 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:36:26.691637 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.691620 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:36:26.691691 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.691650 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:36:26.691691 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.691661 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:36:26.692193 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.692177 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 21:36:26.692193 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.692191 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 21:36:26.692299 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.692212 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 21:36:26.696344 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.696331 2576 policy_none.go:49] "None policy: 
Start" Apr 17 21:36:26.696410 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.696348 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 21:36:26.696410 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.696358 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.737700 2576 manager.go:341] "Starting Device Plugin manager" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.737753 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.737767 2576 server.go:85] "Starting device plugin registration server" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.738069 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.738100 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.738198 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.738296 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.738305 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.738731 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.739759 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.739807 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-47.ec2.internal\" not found" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.740014 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.740041 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.740060 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.740070 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.740174 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 21:36:26.750132 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.743354 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:36:26.838694 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.838601 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:36:26.839638 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.839621 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:36:26.839717 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.839655 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:36:26.839717 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:36:26.839668 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:36:26.839717 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.839694 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.840739 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.840711 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal"] Apr 17 21:36:26.840852 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.840776 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:36:26.842328 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.842302 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:36:26.842399 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.842341 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:36:26.842399 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.842351 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:36:26.844600 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.844587 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:36:26.844728 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.844712 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.844782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.844757 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:36:26.845317 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.845294 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:36:26.845397 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.845335 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:36:26.845397 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.845341 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:36:26.845397 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.845364 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:36:26.845397 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.845373 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:36:26.845526 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.845349 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:36:26.847566 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.847551 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.847630 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.847576 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 21:36:26.848105 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.848070 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.848206 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.848111 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-47.ec2.internal\": node \"ip-10-0-141-47.ec2.internal\" not found" Apr 17 21:36:26.848328 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.848314 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientMemory" Apr 17 21:36:26.848387 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.848355 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 21:36:26.848431 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.848392 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeHasSufficientPID" Apr 17 21:36:26.864365 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.864342 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found" Apr 17 21:36:26.866828 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.866806 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-47.ec2.internal\" not found" node="ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.871053 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.871038 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get 
node info from the cluster" err="node \"ip-10-0-141-47.ec2.internal\" not found" node="ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.873199 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.873180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1b9bf094314e2e37bb2a0096c18823b3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal\" (UID: \"1b9bf094314e2e37bb2a0096c18823b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.873271 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.873209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b9bf094314e2e37bb2a0096c18823b3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal\" (UID: \"1b9bf094314e2e37bb2a0096c18823b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.873271 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.873249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/243afe9ca8e3caca522246c35745403e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-47.ec2.internal\" (UID: \"243afe9ca8e3caca522246c35745403e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.964953 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:26.964924 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found" Apr 17 21:36:26.974327 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.974303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1b9bf094314e2e37bb2a0096c18823b3-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal\" (UID: \"1b9bf094314e2e37bb2a0096c18823b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.974432 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.974333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b9bf094314e2e37bb2a0096c18823b3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal\" (UID: \"1b9bf094314e2e37bb2a0096c18823b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.974432 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.974357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/243afe9ca8e3caca522246c35745403e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-47.ec2.internal\" (UID: \"243afe9ca8e3caca522246c35745403e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.974432 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.974389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1b9bf094314e2e37bb2a0096c18823b3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal\" (UID: \"1b9bf094314e2e37bb2a0096c18823b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" Apr 17 21:36:26.974432 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.974397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b9bf094314e2e37bb2a0096c18823b3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal\" (UID: \"1b9bf094314e2e37bb2a0096c18823b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" Apr 17 
21:36:26.974432 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:26.974392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/243afe9ca8e3caca522246c35745403e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-47.ec2.internal\" (UID: \"243afe9ca8e3caca522246c35745403e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal"
Apr 17 21:36:27.065589 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.065531 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:27.166365 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.166298 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:27.169486 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.169465 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal"
Apr 17 21:36:27.174533 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.174514 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal"
Apr 17 21:36:27.266820 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.266778 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:27.367311 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.367272 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:27.467906 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.467834 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:27.561354 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.561321 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 21:36:27.562003 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.561481 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 21:36:27.568458 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.568434 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:27.668779 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.668739 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:27.670546 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.670520 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 21:36:27.671843 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.671813 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 21:31:26 +0000 UTC" deadline="2027-12-10 17:31:02.593454037 +0000 UTC"
Apr 17 21:36:27.671922 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.671843 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14443h54m34.92161436s"
Apr 17 21:36:27.672102 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.672091 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:36:27.679860 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.679842 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 21:36:27.699482 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.699461 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2shv8"
Apr 17 21:36:27.707315 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.707294 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2shv8"
Apr 17 21:36:27.715973 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.715955 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:36:27.769410 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.769232 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:27.869845 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.869804 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:27.893500 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:27.893453 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod243afe9ca8e3caca522246c35745403e.slice/crio-7004f840569d2234cca6237c7c4b461c0fcedd36dae41098b3f86945326575ed WatchSource:0}: Error finding container 7004f840569d2234cca6237c7c4b461c0fcedd36dae41098b3f86945326575ed: Status 404 returned error can't find the container with id 7004f840569d2234cca6237c7c4b461c0fcedd36dae41098b3f86945326575ed
Apr 17 21:36:27.893739 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:27.893716 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b9bf094314e2e37bb2a0096c18823b3.slice/crio-87f52e4885a858cd8ef0550cef88d12c3c6fc61f5dfb09a15144d4c5d3aee4e6 WatchSource:0}: Error finding container 87f52e4885a858cd8ef0550cef88d12c3c6fc61f5dfb09a15144d4c5d3aee4e6: Status 404 returned error can't find the container with id 87f52e4885a858cd8ef0550cef88d12c3c6fc61f5dfb09a15144d4c5d3aee4e6
Apr 17 21:36:27.897883 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:27.897869 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 21:36:27.970594 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:27.970555 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:28.071180 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.071053 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:28.171788 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.171742 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-47.ec2.internal\" not found"
Apr 17 21:36:28.250551 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.250522 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:36:28.271536 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.271510 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal"
Apr 17 21:36:28.287880 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.287851 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 21:36:28.288951 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.288936 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal"
Apr 17 21:36:28.297293 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.297271 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 21:36:28.645093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.644938 2576 apiserver.go:52] "Watching apiserver"
Apr 17 21:36:28.653649 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.653621 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 21:36:28.654268 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.654171 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-7srzb","openshift-ovn-kubernetes/ovnkube-node-bhdzs","kube-system/konnectivity-agent-bhgcd","kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6","openshift-cluster-node-tuning-operator/tuned-55hgh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal","openshift-multus/multus-kf8f6","openshift-multus/network-metrics-daemon-lncjj","openshift-dns/node-resolver-gjnvl","openshift-image-registry/node-ca-4jhjv","openshift-multus/multus-additional-cni-plugins-b5rqj","openshift-network-diagnostics/network-check-target-jlszn"]
Apr 17 21:36:28.659693 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.659674 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.659808 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.659727 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.662751 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.661978 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:28.662751 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.662048 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f"
Apr 17 21:36:28.662751 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.662236 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jqp4x\""
Apr 17 21:36:28.662751 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.662333 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 21:36:28.662751 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.662575 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 21:36:28.663049 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.662846 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 21:36:28.663049 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.663010 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 21:36:28.663929 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.663274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-757tt\""
Apr 17 21:36:28.663929 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.663848 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 21:36:28.663929 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.663872 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 21:36:28.664386 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.664209 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 21:36:28.664504 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.664463 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 21:36:28.669673 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.669594 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:28.671656 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.669930 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19"
Apr 17 21:36:28.674888 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.674536 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.677229 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.676981 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 21:36:28.677369 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.677329 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.677539 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.677519 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 21:36:28.677778 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.677762 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 21:36:28.677854 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.677803 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5lft9\""
Apr 17 21:36:28.677854 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.677822 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 21:36:28.679786 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.679449 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bhgcd"
Apr 17 21:36:28.680538 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.680132 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 21:36:28.680538 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.680147 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 21:36:28.680538 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.680397 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mkjg4\""
Apr 17 21:36:28.681626 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.681582 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 21:36:28.682299 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.682095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7srzb"
Apr 17 21:36:28.682389 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.682367 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 21:36:28.682602 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.682574 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mk7h2\""
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.682814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-etc-kubernetes\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.682846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.682875 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-run-openvswitch\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.682900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-node-log\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.682941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-run-netns\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.682987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-modprobe-d\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-run\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6mw2\" (UniqueName: \"kubernetes.io/projected/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-kube-api-access-j6mw2\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-cni-dir\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-os-release\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-daemon-config\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-kubelet\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-cnibin\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-os-release\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-run-netns\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.683456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-var-lib-kubelet\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-cni-binary-copy\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683496 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-var-lib-kubelet\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683559 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/096937b2-0789-4eb0-a35d-1a44c37d72dd-ovn-node-metrics-cert\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-system-cni-dir\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxfgz\" (UniqueName: \"kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz\") pod \"network-check-target-jlszn\" (UID: \"9a699952-32f8-4727-bb72-f047e0297d2f\") " pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-var-lib-cni-multus\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnct7\" (UniqueName: \"kubernetes.io/projected/697e918d-013d-41df-9440-059bd3d99a19-kube-api-access-dnct7\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-systemd-units\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-sysconfig\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-systemd\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-lib-modules\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-socket-dir-parent\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-hostroot\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.684179 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-etc-openvswitch\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-sysctl-d\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.683995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-tuned\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-tmp\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-run-k8s-cni-cncf-io\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-conf-dir\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-run-multus-certs\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcdj6\" (UniqueName: \"kubernetes.io/projected/e09621e2-cdbd-4f6e-8317-db02abbe345a-kube-api-access-zcdj6\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/096937b2-0789-4eb0-a35d-1a44c37d72dd-env-overrides\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd2ht\" (UniqueName: \"kubernetes.io/projected/096937b2-0789-4eb0-a35d-1a44c37d72dd-kube-api-access-jd2ht\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-kubernetes\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e09621e2-cdbd-4f6e-8317-db02abbe345a-cni-binary-copy\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-run-systemd\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-sys\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-cnibin\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-slash\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.684836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/096937b2-0789-4eb0-a35d-1a44c37d72dd-ovnkube-config\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/096937b2-0789-4eb0-a35d-1a44c37d72dd-ovnkube-script-lib\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-system-cni-dir\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q2lq\" (UniqueName: \"kubernetes.io/projected/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-kube-api-access-2q2lq\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-var-lib-openvswitch\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-run-ovn\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684761 2576 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-host\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-var-lib-cni-bin\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-cni-bin\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-cni-netd\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-log-socket\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.685597 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:36:28.684891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-sysctl-conf\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.684935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.685371 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.685425 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 21:36:28.685597 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.685573 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-68vhj\"" Apr 17 21:36:28.686345 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.685625 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:36:28.687509 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:36:28.687487 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4jhjv" Apr 17 21:36:28.687832 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.687815 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.688953 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.688935 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 21:36:28.689172 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.689148 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-t6ql7\"" Apr 17 21:36:28.689350 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.689332 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 21:36:28.690373 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.690358 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 21:36:28.690592 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.690573 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9pl4m\"" Apr 17 21:36:28.690592 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.690587 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 21:36:28.690732 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.690628 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 21:36:28.690732 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.690587 2576 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-shwkf\"" Apr 17 21:36:28.690827 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.690773 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 21:36:28.690913 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.690844 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 21:36:28.691011 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.690976 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 21:36:28.710310 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.710148 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:31:27 +0000 UTC" deadline="2027-09-16 06:56:04.307317542 +0000 UTC" Apr 17 21:36:28.710310 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.710174 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12393h19m35.597147238s" Apr 17 21:36:28.749001 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.748902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal" event={"ID":"243afe9ca8e3caca522246c35745403e","Type":"ContainerStarted","Data":"7004f840569d2234cca6237c7c4b461c0fcedd36dae41098b3f86945326575ed"} Apr 17 21:36:28.751319 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.751292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" 
event={"ID":"1b9bf094314e2e37bb2a0096c18823b3","Type":"ContainerStarted","Data":"87f52e4885a858cd8ef0550cef88d12c3c6fc61f5dfb09a15144d4c5d3aee4e6"} Apr 17 21:36:28.772537 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.772490 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 21:36:28.785223 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/97005141-bf7c-468e-b4dd-e370f7fce68d-iptables-alerter-script\") pod \"iptables-alerter-7srzb\" (UID: \"97005141-bf7c-468e-b4dd-e370f7fce68d\") " pod="openshift-network-operator/iptables-alerter-7srzb" Apr 17 21:36:28.785223 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-device-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.785419 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-var-lib-cni-multus\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.785419 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnct7\" (UniqueName: \"kubernetes.io/projected/697e918d-013d-41df-9440-059bd3d99a19-kube-api-access-dnct7\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " 
pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:28.785419 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-systemd-units\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.785419 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-sysconfig\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.785419 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-systemd\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.785419 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-lib-modules\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.785419 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-socket-dir-parent\") pod \"multus-kf8f6\" (UID: 
\"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.785419 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-hostroot\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-etc-openvswitch\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-sysctl-d\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-tuned\") pod \"tuned-55hgh\" (UID: 
\"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-tmp\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svmtx\" (UniqueName: \"kubernetes.io/projected/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-kube-api-access-svmtx\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785595 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-run-k8s-cni-cncf-io\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-conf-dir\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785649 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-run-multus-certs\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcdj6\" (UniqueName: \"kubernetes.io/projected/e09621e2-cdbd-4f6e-8317-db02abbe345a-kube-api-access-zcdj6\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/096937b2-0789-4eb0-a35d-1a44c37d72dd-env-overrides\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd2ht\" (UniqueName: \"kubernetes.io/projected/096937b2-0789-4eb0-a35d-1a44c37d72dd-kube-api-access-jd2ht\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.785762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785761 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-kubernetes\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e09621e2-cdbd-4f6e-8317-db02abbe345a-cni-binary-copy\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-run-systemd\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-sys\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7f2z\" (UniqueName: \"kubernetes.io/projected/97005141-bf7c-468e-b4dd-e370f7fce68d-kube-api-access-m7f2z\") pod \"iptables-alerter-7srzb\" (UID: \"97005141-bf7c-468e-b4dd-e370f7fce68d\") " pod="openshift-network-operator/iptables-alerter-7srzb" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785883 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d6a476e-116f-4845-8146-b9bc2af9d504-tmp-dir\") pod \"node-resolver-gjnvl\" (UID: \"3d6a476e-116f-4845-8146-b9bc2af9d504\") " pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-cnibin\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-slash\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/096937b2-0789-4eb0-a35d-1a44c37d72dd-ovnkube-config\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.785978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/096937b2-0789-4eb0-a35d-1a44c37d72dd-ovnkube-script-lib\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786007 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d6a476e-116f-4845-8146-b9bc2af9d504-hosts-file\") pod \"node-resolver-gjnvl\" (UID: \"3d6a476e-116f-4845-8146-b9bc2af9d504\") " pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02310a72-3db5-42e8-b257-0ccf87bb8deb-host\") pod \"node-ca-4jhjv\" (UID: \"02310a72-3db5-42e8-b257-0ccf87bb8deb\") " pod="openshift-image-registry/node-ca-4jhjv" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-system-cni-dir\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2q2lq\" (UniqueName: \"kubernetes.io/projected/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-kube-api-access-2q2lq\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj" Apr 
17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-var-lib-openvswitch\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-run-ovn\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.786362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-host\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786218 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-var-lib-cni-bin\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-cni-bin\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.787093 
ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-cni-netd\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vns5\" (UniqueName: \"kubernetes.io/projected/3d6a476e-116f-4845-8146-b9bc2af9d504-kube-api-access-9vns5\") pod \"node-resolver-gjnvl\" (UID: \"3d6a476e-116f-4845-8146-b9bc2af9d504\") " pod="openshift-dns/node-resolver-gjnvl"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-log-socket\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-sysctl-conf\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-etc-selinux\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-etc-kubernetes\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-run-openvswitch\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786545 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-node-log\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97005141-bf7c-468e-b4dd-e370f7fce68d-host-slash\") pod \"iptables-alerter-7srzb\" (UID: \"97005141-bf7c-468e-b4dd-e370f7fce68d\") " pod="openshift-network-operator/iptables-alerter-7srzb"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8e6dcb6f-5562-4043-bca2-51c52997f079-konnectivity-ca\") pod \"konnectivity-agent-bhgcd\" (UID: \"8e6dcb6f-5562-4043-bca2-51c52997f079\") " pod="kube-system/konnectivity-agent-bhgcd"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vgx\" (UniqueName: \"kubernetes.io/projected/02310a72-3db5-42e8-b257-0ccf87bb8deb-kube-api-access-q6vgx\") pod \"node-ca-4jhjv\" (UID: \"02310a72-3db5-42e8-b257-0ccf87bb8deb\") " pod="openshift-image-registry/node-ca-4jhjv"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-run-netns\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.787093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-modprobe-d\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-run\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6mw2\" (UniqueName: \"kubernetes.io/projected/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-kube-api-access-j6mw2\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-cni-dir\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-os-release\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-daemon-config\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-kubelet\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-socket-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-cnibin\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786940 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-os-release\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-registration-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.786997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-run-netns\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-var-lib-kubelet\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-cni-binary-copy\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-var-lib-kubelet\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.787782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/096937b2-0789-4eb0-a35d-1a44c37d72dd-ovn-node-metrics-cert\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-sys-fs\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8e6dcb6f-5562-4043-bca2-51c52997f079-agent-certs\") pod \"konnectivity-agent-bhgcd\" (UID: \"8e6dcb6f-5562-4043-bca2-51c52997f079\") " pod="kube-system/konnectivity-agent-bhgcd"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02310a72-3db5-42e8-b257-0ccf87bb8deb-serviceca\") pod \"node-ca-4jhjv\" (UID: \"02310a72-3db5-42e8-b257-0ccf87bb8deb\") " pod="openshift-image-registry/node-ca-4jhjv"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-system-cni-dir\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfgz\" (UniqueName: \"kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz\") pod \"network-check-target-jlszn\" (UID: \"9a699952-32f8-4727-bb72-f047e0297d2f\") " pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-var-lib-cni-multus\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.787959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-systemd-units\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788022 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-sysconfig\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-systemd\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-lib-modules\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-socket-dir-parent\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.788396 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-hostroot\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.788740 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.788740 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-log-socket\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.788740 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.788740 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-os-release\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.788740 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788653 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-sysctl-d\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.788740 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-sysctl-conf\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.788934 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-etc-kubernetes\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.788934 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.788876 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:36:28.789016 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.788951 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 21:36:28.789052 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-run-netns\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.789107 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-run-openvswitch\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.789153 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-run\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.789197 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-node-log\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.789197 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.789293 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-modprobe-d\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.789293 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789261 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-cnibin\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.789293 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-var-lib-kubelet\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.789425 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-os-release\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.789425 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-kubelet\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.789425 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-run-netns\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.789425 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-var-lib-kubelet\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-system-cni-dir\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-cnibin\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789737 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-slash\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.789815 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs podName:697e918d-013d-41df-9440-059bd3d99a19 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:29.288939732 +0000 UTC m=+3.122827660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs") pod "network-metrics-daemon-lncjj" (UID: "697e918d-013d-41df-9440-059bd3d99a19") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.789988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-system-cni-dir\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-cni-dir\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-etc-openvswitch\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/096937b2-0789-4eb0-a35d-1a44c37d72dd-ovnkube-config\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-cni-binary-copy\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790406 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-host\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-var-lib-cni-bin\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-cni-bin\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-conf-dir\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-var-lib-openvswitch\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-run-ovn\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.791202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-run-multus-certs\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/096937b2-0789-4eb0-a35d-1a44c37d72dd-ovnkube-script-lib\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e09621e2-cdbd-4f6e-8317-db02abbe345a-multus-daemon-config\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-sys\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-run-systemd\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.790947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e09621e2-cdbd-4f6e-8317-db02abbe345a-host-run-k8s-cni-cncf-io\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.791048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/096937b2-0789-4eb0-a35d-1a44c37d72dd-env-overrides\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.791125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/096937b2-0789-4eb0-a35d-1a44c37d72dd-host-cni-netd\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.791176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-kubernetes\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.791386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e09621e2-cdbd-4f6e-8317-db02abbe345a-cni-binary-copy\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6"
Apr 17 21:36:28.792025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.791476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj"
Apr 17 21:36:28.792508 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.792380 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 21:36:28.792508 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.792401 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 21:36:28.792508 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.792413 2576 projected.go:194] Error preparing data for projected volume kube-api-access-lxfgz for pod openshift-network-diagnostics/network-check-target-jlszn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:36:28.792508 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:28.792472 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz podName:9a699952-32f8-4727-bb72-f047e0297d2f nodeName:}" failed. No retries permitted until 2026-04-17 21:36:29.292455354 +0000 UTC m=+3.126343283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-lxfgz" (UniqueName: "kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz") pod "network-check-target-jlszn" (UID: "9a699952-32f8-4727-bb72-f047e0297d2f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:36:28.794235 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.794211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-etc-tuned\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.794630 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.794607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/096937b2-0789-4eb0-a35d-1a44c37d72dd-ovn-node-metrics-cert\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:28.795853 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.795810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-tmp\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh"
Apr 17 21:36:28.796212 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.796190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnct7\" (UniqueName: \"kubernetes.io/projected/697e918d-013d-41df-9440-059bd3d99a19-kube-api-access-dnct7\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17
21:36:28.801005 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.800982 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd2ht\" (UniqueName: \"kubernetes.io/projected/096937b2-0789-4eb0-a35d-1a44c37d72dd-kube-api-access-jd2ht\") pod \"ovnkube-node-bhdzs\" (UID: \"096937b2-0789-4eb0-a35d-1a44c37d72dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.801946 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.801876 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcdj6\" (UniqueName: \"kubernetes.io/projected/e09621e2-cdbd-4f6e-8317-db02abbe345a-kube-api-access-zcdj6\") pod \"multus-kf8f6\" (UID: \"e09621e2-cdbd-4f6e-8317-db02abbe345a\") " pod="openshift-multus/multus-kf8f6" Apr 17 21:36:28.802764 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.802743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6mw2\" (UniqueName: \"kubernetes.io/projected/bd21dfdb-2ef5-4eff-a466-0b6e7c28f977-kube-api-access-j6mw2\") pod \"tuned-55hgh\" (UID: \"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977\") " pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.803443 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.803404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q2lq\" (UniqueName: \"kubernetes.io/projected/996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e-kube-api-access-2q2lq\") pod \"multus-additional-cni-plugins-b5rqj\" (UID: \"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e\") " pod="openshift-multus/multus-additional-cni-plugins-b5rqj" Apr 17 21:36:28.815622 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.815597 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:36:28.888525 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-socket-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.888525 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-registration-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.888766 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-sys-fs\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.888766 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8e6dcb6f-5562-4043-bca2-51c52997f079-agent-certs\") pod \"konnectivity-agent-bhgcd\" (UID: \"8e6dcb6f-5562-4043-bca2-51c52997f079\") " pod="kube-system/konnectivity-agent-bhgcd" Apr 17 21:36:28.888766 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-sys-fs\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.888766 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888621 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-registration-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.888766 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02310a72-3db5-42e8-b257-0ccf87bb8deb-serviceca\") pod \"node-ca-4jhjv\" (UID: \"02310a72-3db5-42e8-b257-0ccf87bb8deb\") " pod="openshift-image-registry/node-ca-4jhjv" Apr 17 21:36:28.888766 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/97005141-bf7c-468e-b4dd-e370f7fce68d-iptables-alerter-script\") pod \"iptables-alerter-7srzb\" (UID: \"97005141-bf7c-468e-b4dd-e370f7fce68d\") " pod="openshift-network-operator/iptables-alerter-7srzb" Apr 17 21:36:28.888766 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-socket-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.889119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-device-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.889119 
ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-device-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.889119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svmtx\" (UniqueName: \"kubernetes.io/projected/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-kube-api-access-svmtx\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.889119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7f2z\" (UniqueName: \"kubernetes.io/projected/97005141-bf7c-468e-b4dd-e370f7fce68d-kube-api-access-m7f2z\") pod \"iptables-alerter-7srzb\" (UID: \"97005141-bf7c-468e-b4dd-e370f7fce68d\") " pod="openshift-network-operator/iptables-alerter-7srzb" Apr 17 21:36:28.889119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d6a476e-116f-4845-8146-b9bc2af9d504-tmp-dir\") pod \"node-resolver-gjnvl\" (UID: \"3d6a476e-116f-4845-8146-b9bc2af9d504\") " pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:28.889119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: 
\"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.889119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d6a476e-116f-4845-8146-b9bc2af9d504-hosts-file\") pod \"node-resolver-gjnvl\" (UID: \"3d6a476e-116f-4845-8146-b9bc2af9d504\") " pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:28.889119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.888992 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02310a72-3db5-42e8-b257-0ccf87bb8deb-host\") pod \"node-ca-4jhjv\" (UID: \"02310a72-3db5-42e8-b257-0ccf87bb8deb\") " pod="openshift-image-registry/node-ca-4jhjv" Apr 17 21:36:28.889119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02310a72-3db5-42e8-b257-0ccf87bb8deb-host\") pod \"node-ca-4jhjv\" (UID: \"02310a72-3db5-42e8-b257-0ccf87bb8deb\") " pod="openshift-image-registry/node-ca-4jhjv" Apr 17 21:36:28.889119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889116 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/02310a72-3db5-42e8-b257-0ccf87bb8deb-serviceca\") pod \"node-ca-4jhjv\" (UID: \"02310a72-3db5-42e8-b257-0ccf87bb8deb\") " 
pod="openshift-image-registry/node-ca-4jhjv" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3d6a476e-116f-4845-8146-b9bc2af9d504-hosts-file\") pod \"node-resolver-gjnvl\" (UID: \"3d6a476e-116f-4845-8146-b9bc2af9d504\") " pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/97005141-bf7c-468e-b4dd-e370f7fce68d-iptables-alerter-script\") pod \"iptables-alerter-7srzb\" (UID: \"97005141-bf7c-468e-b4dd-e370f7fce68d\") " pod="openshift-network-operator/iptables-alerter-7srzb" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vns5\" (UniqueName: \"kubernetes.io/projected/3d6a476e-116f-4845-8146-b9bc2af9d504-kube-api-access-9vns5\") pod \"node-resolver-gjnvl\" (UID: \"3d6a476e-116f-4845-8146-b9bc2af9d504\") " pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-etc-selinux\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97005141-bf7c-468e-b4dd-e370f7fce68d-host-slash\") pod \"iptables-alerter-7srzb\" (UID: 
\"97005141-bf7c-468e-b4dd-e370f7fce68d\") " pod="openshift-network-operator/iptables-alerter-7srzb" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3d6a476e-116f-4845-8146-b9bc2af9d504-tmp-dir\") pod \"node-resolver-gjnvl\" (UID: \"3d6a476e-116f-4845-8146-b9bc2af9d504\") " pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97005141-bf7c-468e-b4dd-e370f7fce68d-host-slash\") pod \"iptables-alerter-7srzb\" (UID: \"97005141-bf7c-468e-b4dd-e370f7fce68d\") " pod="openshift-network-operator/iptables-alerter-7srzb" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-etc-selinux\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8e6dcb6f-5562-4043-bca2-51c52997f079-konnectivity-ca\") pod \"konnectivity-agent-bhgcd\" (UID: \"8e6dcb6f-5562-4043-bca2-51c52997f079\") " pod="kube-system/konnectivity-agent-bhgcd" Apr 17 21:36:28.889640 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vgx\" (UniqueName: \"kubernetes.io/projected/02310a72-3db5-42e8-b257-0ccf87bb8deb-kube-api-access-q6vgx\") pod \"node-ca-4jhjv\" 
(UID: \"02310a72-3db5-42e8-b257-0ccf87bb8deb\") " pod="openshift-image-registry/node-ca-4jhjv" Apr 17 21:36:28.890031 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.889973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8e6dcb6f-5562-4043-bca2-51c52997f079-konnectivity-ca\") pod \"konnectivity-agent-bhgcd\" (UID: \"8e6dcb6f-5562-4043-bca2-51c52997f079\") " pod="kube-system/konnectivity-agent-bhgcd" Apr 17 21:36:28.891489 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.891462 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8e6dcb6f-5562-4043-bca2-51c52997f079-agent-certs\") pod \"konnectivity-agent-bhgcd\" (UID: \"8e6dcb6f-5562-4043-bca2-51c52997f079\") " pod="kube-system/konnectivity-agent-bhgcd" Apr 17 21:36:28.897037 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.897012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vgx\" (UniqueName: \"kubernetes.io/projected/02310a72-3db5-42e8-b257-0ccf87bb8deb-kube-api-access-q6vgx\") pod \"node-ca-4jhjv\" (UID: \"02310a72-3db5-42e8-b257-0ccf87bb8deb\") " pod="openshift-image-registry/node-ca-4jhjv" Apr 17 21:36:28.897228 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.897206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vns5\" (UniqueName: \"kubernetes.io/projected/3d6a476e-116f-4845-8146-b9bc2af9d504-kube-api-access-9vns5\") pod \"node-resolver-gjnvl\" (UID: \"3d6a476e-116f-4845-8146-b9bc2af9d504\") " pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:28.897853 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.897837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7f2z\" (UniqueName: \"kubernetes.io/projected/97005141-bf7c-468e-b4dd-e370f7fce68d-kube-api-access-m7f2z\") pod \"iptables-alerter-7srzb\" (UID: 
\"97005141-bf7c-468e-b4dd-e370f7fce68d\") " pod="openshift-network-operator/iptables-alerter-7srzb" Apr 17 21:36:28.898068 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.898043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svmtx\" (UniqueName: \"kubernetes.io/projected/78090d91-9d9e-4fdc-ae8c-1ad4175f499c-kube-api-access-svmtx\") pod \"aws-ebs-csi-driver-node-xdhm6\" (UID: \"78090d91-9d9e-4fdc-ae8c-1ad4175f499c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:28.977145 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.977109 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:36:28.986050 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.986026 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-55hgh" Apr 17 21:36:28.995724 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:28.995694 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kf8f6" Apr 17 21:36:29.002397 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.002338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" Apr 17 21:36:29.014942 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.014919 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bhgcd" Apr 17 21:36:29.027554 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.027517 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7srzb" Apr 17 21:36:29.045159 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.045127 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gjnvl" Apr 17 21:36:29.054773 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.054748 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4jhjv" Apr 17 21:36:29.061445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.061422 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" Apr 17 21:36:29.291213 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.291143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:29.291337 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:29.291258 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:29.291337 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:29.291308 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs podName:697e918d-013d-41df-9440-059bd3d99a19 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:30.291294251 +0000 UTC m=+4.125182156 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs") pod "network-metrics-daemon-lncjj" (UID: "697e918d-013d-41df-9440-059bd3d99a19") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:29.392149 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.392119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfgz\" (UniqueName: \"kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz\") pod \"network-check-target-jlszn\" (UID: \"9a699952-32f8-4727-bb72-f047e0297d2f\") " pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:29.392329 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:29.392287 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:36:29.392329 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:29.392309 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:36:29.392329 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:29.392327 2576 projected.go:194] Error preparing data for projected volume kube-api-access-lxfgz for pod openshift-network-diagnostics/network-check-target-jlszn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:29.392502 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:29.392391 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz podName:9a699952-32f8-4727-bb72-f047e0297d2f nodeName:}" failed. 
No retries permitted until 2026-04-17 21:36:30.392370974 +0000 UTC m=+4.226258884 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-lxfgz" (UniqueName: "kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz") pod "network-check-target-jlszn" (UID: "9a699952-32f8-4727-bb72-f047e0297d2f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:29.661349 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:29.661312 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode09621e2_cdbd_4f6e_8317_db02abbe345a.slice/crio-76b2dd8cdd759ceaa506e49cde4754d7875ce8c8cfc4b469b3513de53c23fe3d WatchSource:0}: Error finding container 76b2dd8cdd759ceaa506e49cde4754d7875ce8c8cfc4b469b3513de53c23fe3d: Status 404 returned error can't find the container with id 76b2dd8cdd759ceaa506e49cde4754d7875ce8c8cfc4b469b3513de53c23fe3d Apr 17 21:36:29.663147 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:29.663124 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996e4d5d_ab1a_4fb7_99bd_6dd1589aac0e.slice/crio-326c71373cda254a185293b9fa93e20a7d21e870e8484ed21f7e888d76d13582 WatchSource:0}: Error finding container 326c71373cda254a185293b9fa93e20a7d21e870e8484ed21f7e888d76d13582: Status 404 returned error can't find the container with id 326c71373cda254a185293b9fa93e20a7d21e870e8484ed21f7e888d76d13582 Apr 17 21:36:29.669896 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:29.669867 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78090d91_9d9e_4fdc_ae8c_1ad4175f499c.slice/crio-463993f53c5accbb4f441e688f27666b7d471113f82bf3103fc45030cc243178 WatchSource:0}: Error finding container 
463993f53c5accbb4f441e688f27666b7d471113f82bf3103fc45030cc243178: Status 404 returned error can't find the container with id 463993f53c5accbb4f441e688f27666b7d471113f82bf3103fc45030cc243178 Apr 17 21:36:29.671109 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:29.671067 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d6a476e_116f_4845_8146_b9bc2af9d504.slice/crio-4fa4c8ff40b2531af2e4df57877ab78c0dffa6ee12cbfb27596ee78cbfda1fce WatchSource:0}: Error finding container 4fa4c8ff40b2531af2e4df57877ab78c0dffa6ee12cbfb27596ee78cbfda1fce: Status 404 returned error can't find the container with id 4fa4c8ff40b2531af2e4df57877ab78c0dffa6ee12cbfb27596ee78cbfda1fce Apr 17 21:36:29.671823 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:29.671801 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e6dcb6f_5562_4043_bca2_51c52997f079.slice/crio-2dd2a6058ee23100ac6da10c3b04d857e109df8ceda96441ef1f37db7717f5fa WatchSource:0}: Error finding container 2dd2a6058ee23100ac6da10c3b04d857e109df8ceda96441ef1f37db7717f5fa: Status 404 returned error can't find the container with id 2dd2a6058ee23100ac6da10c3b04d857e109df8ceda96441ef1f37db7717f5fa Apr 17 21:36:29.672959 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:36:29.672737 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd21dfdb_2ef5_4eff_a466_0b6e7c28f977.slice/crio-ede092ffd22d74efe46d569040b96c08f748407e4a296e02d78a762369a95ec2 WatchSource:0}: Error finding container ede092ffd22d74efe46d569040b96c08f748407e4a296e02d78a762369a95ec2: Status 404 returned error can't find the container with id ede092ffd22d74efe46d569040b96c08f748407e4a296e02d78a762369a95ec2 Apr 17 21:36:29.710297 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.710268 2576 certificate_manager.go:715] "Certificate rotation 
deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:31:27 +0000 UTC" deadline="2027-12-06 19:40:05.428619917 +0000 UTC" Apr 17 21:36:29.710297 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.710294 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14350h3m35.718327696s" Apr 17 21:36:29.741064 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.741037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:29.741231 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:29.741197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:29.753242 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.753205 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" event={"ID":"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e","Type":"ContainerStarted","Data":"326c71373cda254a185293b9fa93e20a7d21e870e8484ed21f7e888d76d13582"} Apr 17 21:36:29.754114 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.754094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kf8f6" event={"ID":"e09621e2-cdbd-4f6e-8317-db02abbe345a","Type":"ContainerStarted","Data":"76b2dd8cdd759ceaa506e49cde4754d7875ce8c8cfc4b469b3513de53c23fe3d"} Apr 17 21:36:29.754934 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.754913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4jhjv" 
event={"ID":"02310a72-3db5-42e8-b257-0ccf87bb8deb","Type":"ContainerStarted","Data":"ecb321647eadfeba30579b83f2f86388e0c13c8d2e6b9018d346454f5a85dc7e"} Apr 17 21:36:29.755753 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.755716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-55hgh" event={"ID":"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977","Type":"ContainerStarted","Data":"ede092ffd22d74efe46d569040b96c08f748407e4a296e02d78a762369a95ec2"} Apr 17 21:36:29.756593 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.756567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bhgcd" event={"ID":"8e6dcb6f-5562-4043-bca2-51c52997f079","Type":"ContainerStarted","Data":"2dd2a6058ee23100ac6da10c3b04d857e109df8ceda96441ef1f37db7717f5fa"} Apr 17 21:36:29.757408 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.757392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gjnvl" event={"ID":"3d6a476e-116f-4845-8146-b9bc2af9d504","Type":"ContainerStarted","Data":"4fa4c8ff40b2531af2e4df57877ab78c0dffa6ee12cbfb27596ee78cbfda1fce"} Apr 17 21:36:29.758266 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.758240 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7srzb" event={"ID":"97005141-bf7c-468e-b4dd-e370f7fce68d","Type":"ContainerStarted","Data":"cc0c3e86696b99c4e010cf5720830617926cacf91fce0a3ec805490af276be6a"} Apr 17 21:36:29.759173 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.759155 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" event={"ID":"78090d91-9d9e-4fdc-ae8c-1ad4175f499c","Type":"ContainerStarted","Data":"463993f53c5accbb4f441e688f27666b7d471113f82bf3103fc45030cc243178"} Apr 17 21:36:29.760003 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:29.759985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" event={"ID":"096937b2-0789-4eb0-a35d-1a44c37d72dd","Type":"ContainerStarted","Data":"7377c6d182f58b48b36de24ec055e12b4df3e445db5ba2ef97102e709bf8d342"} Apr 17 21:36:30.299901 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:30.299537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:30.299901 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:30.299876 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:30.300154 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:30.299953 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs podName:697e918d-013d-41df-9440-059bd3d99a19 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:32.299932686 +0000 UTC m=+6.133820592 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs") pod "network-metrics-daemon-lncjj" (UID: "697e918d-013d-41df-9440-059bd3d99a19") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:30.402176 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:30.402136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfgz\" (UniqueName: \"kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz\") pod \"network-check-target-jlszn\" (UID: \"9a699952-32f8-4727-bb72-f047e0297d2f\") " pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:30.402355 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:30.402321 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:36:30.402355 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:30.402340 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:36:30.402355 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:30.402353 2576 projected.go:194] Error preparing data for projected volume kube-api-access-lxfgz for pod openshift-network-diagnostics/network-check-target-jlszn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:30.402539 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:30.402410 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz podName:9a699952-32f8-4727-bb72-f047e0297d2f nodeName:}" failed. 
No retries permitted until 2026-04-17 21:36:32.402393004 +0000 UTC m=+6.236280927 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-lxfgz" (UniqueName: "kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz") pod "network-check-target-jlszn" (UID: "9a699952-32f8-4727-bb72-f047e0297d2f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:30.743567 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:30.742731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:30.743567 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:30.742857 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f" Apr 17 21:36:30.782398 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:30.782362 2576 generic.go:358] "Generic (PLEG): container finished" podID="1b9bf094314e2e37bb2a0096c18823b3" containerID="2eab2024d06d17230540597aca10e2c1cef35446a5869adb16029717adc64d5a" exitCode=0 Apr 17 21:36:30.788764 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:30.788728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" event={"ID":"1b9bf094314e2e37bb2a0096c18823b3","Type":"ContainerDied","Data":"2eab2024d06d17230540597aca10e2c1cef35446a5869adb16029717adc64d5a"} Apr 17 21:36:30.796064 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:30.795636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal" event={"ID":"243afe9ca8e3caca522246c35745403e","Type":"ContainerStarted","Data":"d77d34092d0ad87b883f7e74f94ee5ebc8e68eea2ed3830c02d4aa4b787ee690"} Apr 17 21:36:31.740454 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:31.740422 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:31.740628 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:31.740556 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:31.828159 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:31.828117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" event={"ID":"1b9bf094314e2e37bb2a0096c18823b3","Type":"ContainerStarted","Data":"741607248ec6d2db4addc09257694d6645cc0d45dfe5a8deb77aeb3b7e4299ef"} Apr 17 21:36:31.841966 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:31.841588 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-47.ec2.internal" podStartSLOduration=3.841566761 podStartE2EDuration="3.841566761s" podCreationTimestamp="2026-04-17 21:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:36:30.814784057 +0000 UTC m=+4.648671986" watchObservedRunningTime="2026-04-17 21:36:31.841566761 +0000 UTC m=+5.675454693" Apr 17 21:36:32.318096 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:32.317961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:32.318284 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:32.318171 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:32.318284 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:32.318235 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs podName:697e918d-013d-41df-9440-059bd3d99a19 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:36:36.318217094 +0000 UTC m=+10.152105014 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs") pod "network-metrics-daemon-lncjj" (UID: "697e918d-013d-41df-9440-059bd3d99a19") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:32.419625 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:32.418899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfgz\" (UniqueName: \"kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz\") pod \"network-check-target-jlszn\" (UID: \"9a699952-32f8-4727-bb72-f047e0297d2f\") " pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:32.419625 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:32.419098 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:36:32.419625 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:32.419118 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:36:32.419625 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:32.419131 2576 projected.go:194] Error preparing data for projected volume kube-api-access-lxfgz for pod openshift-network-diagnostics/network-check-target-jlszn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:32.419625 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:32.419233 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz 
podName:9a699952-32f8-4727-bb72-f047e0297d2f nodeName:}" failed. No retries permitted until 2026-04-17 21:36:36.419212753 +0000 UTC m=+10.253100672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-lxfgz" (UniqueName: "kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz") pod "network-check-target-jlszn" (UID: "9a699952-32f8-4727-bb72-f047e0297d2f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:32.741878 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:32.741304 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:32.741878 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:32.741458 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f" Apr 17 21:36:33.741221 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:33.741187 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:33.741756 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:33.741340 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:34.741164 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:34.740632 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:34.741164 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:34.740780 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f" Apr 17 21:36:35.741316 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:35.740742 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:35.741316 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:35.740893 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:36.351468 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:36.351429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:36.351701 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:36.351650 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:36.351774 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:36.351724 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs podName:697e918d-013d-41df-9440-059bd3d99a19 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:44.351703822 +0000 UTC m=+18.185591752 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs") pod "network-metrics-daemon-lncjj" (UID: "697e918d-013d-41df-9440-059bd3d99a19") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:36.452897 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:36.452827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfgz\" (UniqueName: \"kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz\") pod \"network-check-target-jlszn\" (UID: \"9a699952-32f8-4727-bb72-f047e0297d2f\") " pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:36.453053 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:36.452958 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:36:36.453053 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:36.452979 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:36:36.453053 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:36.452991 2576 projected.go:194] Error preparing data for projected volume kube-api-access-lxfgz for pod openshift-network-diagnostics/network-check-target-jlszn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:36.453053 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:36.453041 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz podName:9a699952-32f8-4727-bb72-f047e0297d2f nodeName:}" failed. 
No retries permitted until 2026-04-17 21:36:44.45302747 +0000 UTC m=+18.286915380 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-lxfgz" (UniqueName: "kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz") pod "network-check-target-jlszn" (UID: "9a699952-32f8-4727-bb72-f047e0297d2f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:36.743394 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:36.743321 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:36.743878 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:36.743438 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f" Apr 17 21:36:37.741182 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:37.741151 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:37.741381 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:37.741289 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:38.743891 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:38.743864 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:38.744307 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:38.743971 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f" Apr 17 21:36:39.740940 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:39.740904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:39.741151 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:39.741036 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:40.741224 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:40.741188 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:40.741747 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:40.741305 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f" Apr 17 21:36:41.741185 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:41.741148 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:41.741360 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:41.741261 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:42.743243 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:42.743216 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:42.743633 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:42.743316 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f" Apr 17 21:36:43.740748 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:43.740711 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:43.740960 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:43.740825 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:43.911494 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:43.911437 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-47.ec2.internal" podStartSLOduration=15.911417379 podStartE2EDuration="15.911417379s" podCreationTimestamp="2026-04-17 21:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:36:31.840933521 +0000 UTC m=+5.674821450" watchObservedRunningTime="2026-04-17 21:36:43.911417379 +0000 UTC m=+17.745305391" Apr 17 21:36:43.912047 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:43.912028 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rkkmx"] Apr 17 21:36:43.922083 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:43.922049 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:43.922224 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:43.922134 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4" Apr 17 21:36:44.004361 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.004271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:44.004361 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.004330 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f3915b71-644a-48b2-a22a-a629db33eec4-dbus\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:44.004555 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.004394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f3915b71-644a-48b2-a22a-a629db33eec4-kubelet-config\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:44.104742 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.104703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:44.104910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.104761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f3915b71-644a-48b2-a22a-a629db33eec4-dbus\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:44.104910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.104786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f3915b71-644a-48b2-a22a-a629db33eec4-kubelet-config\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:44.104910 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.104885 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 21:36:44.104910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.104900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f3915b71-644a-48b2-a22a-a629db33eec4-kubelet-config\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:44.105164 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.104966 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret podName:f3915b71-644a-48b2-a22a-a629db33eec4 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:36:44.604949509 +0000 UTC m=+18.438837420 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret") pod "global-pull-secret-syncer-rkkmx" (UID: "f3915b71-644a-48b2-a22a-a629db33eec4") : object "kube-system"/"original-pull-secret" not registered Apr 17 21:36:44.105164 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.104991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f3915b71-644a-48b2-a22a-a629db33eec4-dbus\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:44.408146 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.408037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:44.408296 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.408195 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:44.408296 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.408269 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs podName:697e918d-013d-41df-9440-059bd3d99a19 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:00.408248019 +0000 UTC m=+34.242135951 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs") pod "network-metrics-daemon-lncjj" (UID: "697e918d-013d-41df-9440-059bd3d99a19") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:36:44.508763 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.508725 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfgz\" (UniqueName: \"kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz\") pod \"network-check-target-jlszn\" (UID: \"9a699952-32f8-4727-bb72-f047e0297d2f\") " pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:44.508922 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.508903 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 21:36:44.508999 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.508925 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 21:36:44.508999 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.508940 2576 projected.go:194] Error preparing data for projected volume kube-api-access-lxfgz for pod openshift-network-diagnostics/network-check-target-jlszn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:36:44.508999 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.508996 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz podName:9a699952-32f8-4727-bb72-f047e0297d2f nodeName:}" failed. No retries permitted until 2026-04-17 21:37:00.508981883 +0000 UTC m=+34.342869793 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-lxfgz" (UniqueName: "kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz") pod "network-check-target-jlszn" (UID: "9a699952-32f8-4727-bb72-f047e0297d2f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 21:36:44.610021 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.609967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:44.610199 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.610101 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 21:36:44.610199 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.610176 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret podName:f3915b71-644a-48b2-a22a-a629db33eec4 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:45.610160189 +0000 UTC m=+19.444048095 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret") pod "global-pull-secret-syncer-rkkmx" (UID: "f3915b71-644a-48b2-a22a-a629db33eec4") : object "kube-system"/"original-pull-secret" not registered
Apr 17 21:36:44.741067 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:44.740976 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:44.741244 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:44.741110 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f"
Apr 17 21:36:45.618382 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:45.618341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:45.618876 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:45.618492 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 21:36:45.618876 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:45.618570 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret podName:f3915b71-644a-48b2-a22a-a629db33eec4 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:47.618553562 +0000 UTC m=+21.452441488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret") pod "global-pull-secret-syncer-rkkmx" (UID: "f3915b71-644a-48b2-a22a-a629db33eec4") : object "kube-system"/"original-pull-secret" not registered
Apr 17 21:36:45.741129 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:45.741095 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:45.741317 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:45.741230 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4"
Apr 17 21:36:45.741317 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:45.741293 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:45.741440 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:45.741416 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19"
Apr 17 21:36:46.742912 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:46.742886 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:46.742912 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:46.742899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:46.743370 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:46.742979 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4"
Apr 17 21:36:46.743370 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:46.743134 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f"
Apr 17 21:36:47.633535 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.633254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:47.633535 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:47.633416 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 21:36:47.633739 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:47.633608 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret podName:f3915b71-644a-48b2-a22a-a629db33eec4 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:51.633587584 +0000 UTC m=+25.467475513 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret") pod "global-pull-secret-syncer-rkkmx" (UID: "f3915b71-644a-48b2-a22a-a629db33eec4") : object "kube-system"/"original-pull-secret" not registered
Apr 17 21:36:47.740710 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.740688 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:47.740806 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:47.740791 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19"
Apr 17 21:36:47.857965 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.857933 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" event={"ID":"78090d91-9d9e-4fdc-ae8c-1ad4175f499c","Type":"ContainerStarted","Data":"a04a6550e1288dbe6d60c6166df98601e7b232005d28511d0dddfdde8b9b39f6"}
Apr 17 21:36:47.859676 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.859651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" event={"ID":"096937b2-0789-4eb0-a35d-1a44c37d72dd","Type":"ContainerStarted","Data":"c59e1dc29415d3206cd1313a8c916452ef7bf057e7d1a0c2456161719a550605"}
Apr 17 21:36:47.859783 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.859685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" event={"ID":"096937b2-0789-4eb0-a35d-1a44c37d72dd","Type":"ContainerStarted","Data":"d0283f0f0e809ee4704e28d854665f95cd4ceef7d22f166a111fd6f802bd453f"}
Apr 17 21:36:47.859783 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.859699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" event={"ID":"096937b2-0789-4eb0-a35d-1a44c37d72dd","Type":"ContainerStarted","Data":"588be314f9ac071d8107a20077802fba9dafe05e832292199f532b541fda2214"}
Apr 17 21:36:47.861145 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.861116 2576 generic.go:358] "Generic (PLEG): container finished" podID="996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e" containerID="b13c28325fae394cb08e7f8bf7730694f2e7b5f14f65ea2c2ba6948462afa534" exitCode=0
Apr 17 21:36:47.861226 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.861194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" event={"ID":"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e","Type":"ContainerDied","Data":"b13c28325fae394cb08e7f8bf7730694f2e7b5f14f65ea2c2ba6948462afa534"}
Apr 17 21:36:47.862744 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.862718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kf8f6" event={"ID":"e09621e2-cdbd-4f6e-8317-db02abbe345a","Type":"ContainerStarted","Data":"532ab580ac84cd831888045f4a2e3abd7770a767f8cac2d4c0dd488add14409a"}
Apr 17 21:36:47.863910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.863889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4jhjv" event={"ID":"02310a72-3db5-42e8-b257-0ccf87bb8deb","Type":"ContainerStarted","Data":"5b5eda36cf623b7287c2eb29ffe7568bca1cff0ff1ae3f23d66ffde8b51472e0"}
Apr 17 21:36:47.865035 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.865015 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-55hgh" event={"ID":"bd21dfdb-2ef5-4eff-a466-0b6e7c28f977","Type":"ContainerStarted","Data":"d9ca58621c8889817678a940a2bee820fd9b568511cbecd1dac605275cc36c0d"}
Apr 17 21:36:47.866131 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.866107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bhgcd" event={"ID":"8e6dcb6f-5562-4043-bca2-51c52997f079","Type":"ContainerStarted","Data":"a84408d0909540ecf9bc42ab12b56a3b997c74a130bbe644825e036ca580b65e"}
Apr 17 21:36:47.867259 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.867235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gjnvl" event={"ID":"3d6a476e-116f-4845-8146-b9bc2af9d504","Type":"ContainerStarted","Data":"f06a21dc2989d8829e0289713a4def2eee78d0b0fbcea192de4197c11e4dc3b7"}
Apr 17 21:36:47.895189 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.895140 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kf8f6" podStartSLOduration=4.319990138 podStartE2EDuration="21.895119982s" podCreationTimestamp="2026-04-17 21:36:26 +0000 UTC" firstStartedPulling="2026-04-17 21:36:29.662822393 +0000 UTC m=+3.496710302" lastFinishedPulling="2026-04-17 21:36:47.237952226 +0000 UTC m=+21.071840146" observedRunningTime="2026-04-17 21:36:47.894792355 +0000 UTC m=+21.728680283" watchObservedRunningTime="2026-04-17 21:36:47.895119982 +0000 UTC m=+21.729007911"
Apr 17 21:36:47.918191 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.918149 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-55hgh" podStartSLOduration=4.355568898 podStartE2EDuration="21.918135565s" podCreationTimestamp="2026-04-17 21:36:26 +0000 UTC" firstStartedPulling="2026-04-17 21:36:29.675581161 +0000 UTC m=+3.509469067" lastFinishedPulling="2026-04-17 21:36:47.238147815 +0000 UTC m=+21.072035734" observedRunningTime="2026-04-17 21:36:47.918062003 +0000 UTC m=+21.751949925" watchObservedRunningTime="2026-04-17 21:36:47.918135565 +0000 UTC m=+21.752023489"
Apr 17 21:36:47.918424 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.918404 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4jhjv" podStartSLOduration=11.593877645 podStartE2EDuration="20.91839781s" podCreationTimestamp="2026-04-17 21:36:27 +0000 UTC" firstStartedPulling="2026-04-17 21:36:29.676619305 +0000 UTC m=+3.510507223" lastFinishedPulling="2026-04-17 21:36:39.001139469 +0000 UTC m=+12.835027388" observedRunningTime="2026-04-17 21:36:47.906022254 +0000 UTC m=+21.739910183" watchObservedRunningTime="2026-04-17 21:36:47.91839781 +0000 UTC m=+21.752285807"
Apr 17 21:36:47.935426 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.935220 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bhgcd" podStartSLOduration=3.480205768 podStartE2EDuration="20.935202001s" podCreationTimestamp="2026-04-17 21:36:27 +0000 UTC" firstStartedPulling="2026-04-17 21:36:29.674033808 +0000 UTC m=+3.507921719" lastFinishedPulling="2026-04-17 21:36:47.12903004 +0000 UTC m=+20.962917952" observedRunningTime="2026-04-17 21:36:47.934551004 +0000 UTC m=+21.768438932" watchObservedRunningTime="2026-04-17 21:36:47.935202001 +0000 UTC m=+21.769089930"
Apr 17 21:36:47.951137 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:47.951093 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gjnvl" podStartSLOduration=3.40642673 podStartE2EDuration="20.951059212s" podCreationTimestamp="2026-04-17 21:36:27 +0000 UTC" firstStartedPulling="2026-04-17 21:36:29.67298745 +0000 UTC m=+3.506875368" lastFinishedPulling="2026-04-17 21:36:47.217619937 +0000 UTC m=+21.051507850" observedRunningTime="2026-04-17 21:36:47.950817741 +0000 UTC m=+21.784705682" watchObservedRunningTime="2026-04-17 21:36:47.951059212 +0000 UTC m=+21.784947139"
Apr 17 21:36:48.465624 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.465572 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 21:36:48.741187 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.741159 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:48.741460 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:48.741267 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4"
Apr 17 21:36:48.741460 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.741298 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:48.741460 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:48.741348 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f"
Apr 17 21:36:48.754336 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.754242 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T21:36:48.465593119Z","UUID":"04791d05-66ff-4691-874a-4d0a1c9a2545","Handler":null,"Name":"","Endpoint":""}
Apr 17 21:36:48.757163 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.757134 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 21:36:48.757163 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.757167 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 21:36:48.870670 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.870637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7srzb" event={"ID":"97005141-bf7c-468e-b4dd-e370f7fce68d","Type":"ContainerStarted","Data":"94f5febb8a29fcbd49d93d1f2a7b79a78ba1914cf1eb4abda2dbde28d048c6d1"}
Apr 17 21:36:48.872233 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.872209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" event={"ID":"78090d91-9d9e-4fdc-ae8c-1ad4175f499c","Type":"ContainerStarted","Data":"20d96cb25fab5cf1f3496d447faa5d68e61d7852066458a8e3be5ede436f4656"}
Apr 17 21:36:48.874617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.874590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" event={"ID":"096937b2-0789-4eb0-a35d-1a44c37d72dd","Type":"ContainerStarted","Data":"d8d2c58a50c32922cff04d71579ca6564f5055009dce0747325339dcad48213c"}
Apr 17 21:36:48.874712 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.874624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" event={"ID":"096937b2-0789-4eb0-a35d-1a44c37d72dd","Type":"ContainerStarted","Data":"8f11c731173bbcdea985af1dcfdde4cdad4175dde1d3f6a5d07a583ad9449427"}
Apr 17 21:36:48.874712 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.874638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" event={"ID":"096937b2-0789-4eb0-a35d-1a44c37d72dd","Type":"ContainerStarted","Data":"57667f6b58b1b77502e5590cc4c98ec0f8d03088b9e2b269c137bf8265b9066c"}
Apr 17 21:36:48.884413 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:48.884371 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7srzb" podStartSLOduration=4.343190474 podStartE2EDuration="21.884358888s" podCreationTimestamp="2026-04-17 21:36:27 +0000 UTC" firstStartedPulling="2026-04-17 21:36:29.676468623 +0000 UTC m=+3.510356535" lastFinishedPulling="2026-04-17 21:36:47.217637028 +0000 UTC m=+21.051524949" observedRunningTime="2026-04-17 21:36:48.884180105 +0000 UTC m=+22.718068035" watchObservedRunningTime="2026-04-17 21:36:48.884358888 +0000 UTC m=+22.718246816"
Apr 17 21:36:49.740378 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:49.740285 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:49.740531 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:49.740426 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19"
Apr 17 21:36:49.877929 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:49.877889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" event={"ID":"78090d91-9d9e-4fdc-ae8c-1ad4175f499c","Type":"ContainerStarted","Data":"a73f23152f99e8ed5f10e639af7342312f3b988e1008d8203fa716fe28c1d937"}
Apr 17 21:36:49.897474 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:49.897412 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xdhm6" podStartSLOduration=3.178907712 podStartE2EDuration="22.897395315s" podCreationTimestamp="2026-04-17 21:36:27 +0000 UTC" firstStartedPulling="2026-04-17 21:36:29.671805166 +0000 UTC m=+3.505693075" lastFinishedPulling="2026-04-17 21:36:49.390292769 +0000 UTC m=+23.224180678" observedRunningTime="2026-04-17 21:36:49.896023166 +0000 UTC m=+23.729911096" watchObservedRunningTime="2026-04-17 21:36:49.897395315 +0000 UTC m=+23.731283241"
Apr 17 21:36:50.022941 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:50.022862 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bhgcd"
Apr 17 21:36:50.023670 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:50.023652 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bhgcd"
Apr 17 21:36:50.740439 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:50.740408 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:50.740634 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:50.740408 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:50.740634 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:50.740532 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4"
Apr 17 21:36:50.740634 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:50.740591 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f"
Apr 17 21:36:50.883544 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:50.883493 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" event={"ID":"096937b2-0789-4eb0-a35d-1a44c37d72dd","Type":"ContainerStarted","Data":"6fe3ad6ce5086116016eec798f30eaffccbd263e57b367c4a3dfb542b84c789e"}
Apr 17 21:36:50.884183 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:50.883993 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bhgcd"
Apr 17 21:36:50.884449 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:50.884423 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bhgcd"
Apr 17 21:36:51.665286 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:51.665239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:51.665471 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:51.665398 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 21:36:51.665471 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:51.665465 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret podName:f3915b71-644a-48b2-a22a-a629db33eec4 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:59.665448477 +0000 UTC m=+33.499336387 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret") pod "global-pull-secret-syncer-rkkmx" (UID: "f3915b71-644a-48b2-a22a-a629db33eec4") : object "kube-system"/"original-pull-secret" not registered
Apr 17 21:36:51.740999 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:51.740965 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:51.741183 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:51.741110 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19"
Apr 17 21:36:52.740620 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:52.740365 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:52.741096 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:52.740365 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:52.741096 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:52.740652 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4"
Apr 17 21:36:52.741096 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:52.740736 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f"
Apr 17 21:36:52.889562 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:52.889516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" event={"ID":"096937b2-0789-4eb0-a35d-1a44c37d72dd","Type":"ContainerStarted","Data":"ab824d3ecbc7313cfb3dda29995ca3149ebea8602ad7c96974162835c258191e"}
Apr 17 21:36:52.889927 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:52.889896 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:52.889927 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:52.889931 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:52.891306 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:52.891281 2576 generic.go:358] "Generic (PLEG): container finished" podID="996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e" containerID="663c2a417ab8d1aa8fb36ff5ead400276fad550f40f763f760367e1fdb706213" exitCode=0
Apr 17 21:36:52.891431 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:52.891369 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" event={"ID":"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e","Type":"ContainerDied","Data":"663c2a417ab8d1aa8fb36ff5ead400276fad550f40f763f760367e1fdb706213"}
Apr 17 21:36:52.905363 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:52.905338 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:52.914054 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:52.914013 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" podStartSLOduration=9.018649981 podStartE2EDuration="26.914001314s" podCreationTimestamp="2026-04-17 21:36:26 +0000 UTC" firstStartedPulling="2026-04-17 21:36:29.668782945 +0000 UTC m=+3.502670859" lastFinishedPulling="2026-04-17 21:36:47.564134283 +0000 UTC m=+21.398022192" observedRunningTime="2026-04-17 21:36:52.913599012 +0000 UTC m=+26.747486962" watchObservedRunningTime="2026-04-17 21:36:52.914001314 +0000 UTC m=+26.747889242"
Apr 17 21:36:53.740941 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:53.740781 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:53.741225 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:53.741028 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19"
Apr 17 21:36:53.896022 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:53.895945 2576 generic.go:358] "Generic (PLEG): container finished" podID="996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e" containerID="650535af43b6bb7f600f7bad801ee70141efe7e9709e4bd2d792120babb54623" exitCode=0
Apr 17 21:36:53.896161 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:53.896020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" event={"ID":"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e","Type":"ContainerDied","Data":"650535af43b6bb7f600f7bad801ee70141efe7e9709e4bd2d792120babb54623"}
Apr 17 21:36:53.896591 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:53.896567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:53.912777 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:53.912754 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs"
Apr 17 21:36:54.237210 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:54.237138 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jlszn"]
Apr 17 21:36:54.237375 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:54.237260 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:54.237375 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:54.237336 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f"
Apr 17 21:36:54.240170 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:54.240145 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lncjj"]
Apr 17 21:36:54.240298 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:54.240255 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:54.240355 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:54.240339 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19"
Apr 17 21:36:54.240732 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:54.240710 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rkkmx"]
Apr 17 21:36:54.240831 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:54.240818 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:54.240923 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:54.240902 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4"
Apr 17 21:36:54.899815 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:54.899783 2576 generic.go:358] "Generic (PLEG): container finished" podID="996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e" containerID="10c6401071347f10a6bb93cabed41899c8bb37e7e2828737dda04e30043a8126" exitCode=0
Apr 17 21:36:54.900418 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:54.899878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" event={"ID":"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e","Type":"ContainerDied","Data":"10c6401071347f10a6bb93cabed41899c8bb37e7e2828737dda04e30043a8126"}
Apr 17 21:36:55.741118 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:55.741088 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:36:55.741311 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:55.741086 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:55.741311 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:55.741212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:55.741311 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:55.741210 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f"
Apr 17 21:36:55.741469 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:55.741321 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4"
Apr 17 21:36:55.741469 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:55.741433 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19"
Apr 17 21:36:57.740814 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:57.740780 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:36:57.741628 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:57.740776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:36:57.741628 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:57.740936 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:57.741628 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:57.740777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:57.741628 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:57.740997 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4" Apr 17 21:36:57.741628 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:57.741041 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f" Apr 17 21:36:59.728425 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:59.728374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:59.728800 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:59.728523 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 21:36:59.728800 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:59.728589 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret podName:f3915b71-644a-48b2-a22a-a629db33eec4 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:15.728573118 +0000 UTC m=+49.562461023 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret") pod "global-pull-secret-syncer-rkkmx" (UID: "f3915b71-644a-48b2-a22a-a629db33eec4") : object "kube-system"/"original-pull-secret" not registered Apr 17 21:36:59.740617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:59.740587 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:36:59.740739 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:59.740587 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:36:59.740739 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:59.740688 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lncjj" podUID="697e918d-013d-41df-9440-059bd3d99a19" Apr 17 21:36:59.740739 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:59.740697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx" Apr 17 21:36:59.740858 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:59.740769 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jlszn" podUID="9a699952-32f8-4727-bb72-f047e0297d2f" Apr 17 21:36:59.740895 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:36:59.740847 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rkkmx" podUID="f3915b71-644a-48b2-a22a-a629db33eec4" Apr 17 21:36:59.993980 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:59.993885 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-47.ec2.internal" event="NodeReady" Apr 17 21:36:59.994180 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:36:59.994042 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 21:37:00.025103 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.025058 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-h57tx"] Apr 17 21:37:00.048762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.048493 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm"] Apr 17 21:37:00.048762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.048757 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" Apr 17 21:37:00.051264 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.051238 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 21:37:00.051561 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.051539 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 21:37:00.051767 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.051751 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.051840 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.051771 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.052014 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.051987 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-j8tfj\"" Apr 17 21:37:00.065421 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.065394 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5477664494-rrxhm"] Apr 17 21:37:00.068093 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.068042 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" Apr 17 21:37:00.072057 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.072039 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 21:37:00.072267 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.072134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 21:37:00.072656 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.072370 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-lx55l\"" Apr 17 21:37:00.072656 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.072556 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.072931 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.072909 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.090386 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.090362 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr"] Apr 17 21:37:00.090529 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.090498 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.093154 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.093131 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 21:37:00.093418 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.093398 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 21:37:00.093558 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.093496 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lmbh2\"" Apr 17 21:37:00.093622 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.093576 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 21:37:00.099751 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.099654 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 21:37:00.108251 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.108230 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-625wr"] Apr 17 21:37:00.108412 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.108395 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr" Apr 17 21:37:00.111015 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.110975 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.111143 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.111039 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zrvlb\"" Apr 17 21:37:00.111202 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.110976 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.128030 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.128005 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"] Apr 17 21:37:00.128199 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.128183 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-625wr" Apr 17 21:37:00.130803 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" Apr 17 21:37:00.130803 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-config\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" Apr 17 21:37:00.131010 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-trusted-ca\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" Apr 17 21:37:00.131010 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130820 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 21:37:00.131010 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130842 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-dkjz2\"" Apr 17 21:37:00.131010 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130858 2576 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.131010 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130863 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-serving-cert\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" Apr 17 21:37:00.131010 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdm7\" (UniqueName: \"kubernetes.io/projected/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-kube-api-access-9zdm7\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" Apr 17 21:37:00.131010 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwk4s\" (UniqueName: \"kubernetes.io/projected/3bab4632-8085-482e-acfd-ff3769ca3407-kube-api-access-hwk4s\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" Apr 17 21:37:00.131010 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.130819 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.131607 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.131151 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 21:37:00.139547 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.139516 
2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 21:37:00.144529 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.144504 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-h57tx"] Apr 17 21:37:00.144529 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.144532 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7646cfc968-5h87d"] Apr 17 21:37:00.144722 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.144682 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" Apr 17 21:37:00.147507 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.147475 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.147507 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.147477 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 21:37:00.148963 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.148941 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.149092 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.148941 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-7cjxn\"" Apr 17 21:37:00.149092 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.148944 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 21:37:00.160226 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.160202 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5"] Apr 17 21:37:00.160364 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.160346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:00.163215 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.163188 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 21:37:00.163359 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.163300 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 21:37:00.163434 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.163421 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.163629 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.163610 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 21:37:00.163726 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.163621 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 21:37:00.163825 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.163810 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-dk2tl\"" Apr 17 21:37:00.164001 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.163985 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.175556 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.175527 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk"] 
Apr 17 21:37:00.175704 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.175574 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" Apr 17 21:37:00.178245 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.178001 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 21:37:00.178245 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.178130 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.178245 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.178187 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.178529 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.178318 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 21:37:00.178529 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.178432 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-x8sfk\"" Apr 17 21:37:00.196387 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.196359 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm"] Apr 17 21:37:00.196387 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.196387 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6"] Apr 17 21:37:00.196575 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.196514 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" Apr 17 21:37:00.199578 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.199550 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 21:37:00.199676 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.199591 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.199676 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.199611 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.199676 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.199664 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 21:37:00.199912 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.199888 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-l77vn\"" Apr 17 21:37:00.211620 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.211595 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf"] Apr 17 21:37:00.211740 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.211729 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6" Apr 17 21:37:00.214359 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.214336 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.214675 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.214408 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.214675 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.214345 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-z85kj\"" Apr 17 21:37:00.229416 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.229395 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"] Apr 17 21:37:00.229558 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.229539 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf" Apr 17 21:37:00.231902 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.231856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-bound-sa-token\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.231902 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.231902 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-serving-cert\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr" Apr 17 21:37:00.232117 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.231929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.232117 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.231990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9pq4\" (UniqueName: \"kubernetes.io/projected/fa074a8e-430b-4109-87d9-1949bf0c1d86-kube-api-access-f9pq4\") pod \"volume-data-source-validator-7c6cbb6c87-lqwtr\" (UID: \"fa074a8e-430b-4109-87d9-1949bf0c1d86\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr" Apr 17 21:37:00.232117 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:37:00.232022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/538ad283-f697-4eb8-b901-99763f2b9340-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" Apr 17 21:37:00.232117 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" Apr 17 21:37:00.232321 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232145 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 21:37:00.232321 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.232194 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 21:37:00.232321 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232200 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-4d2kd\"" Apr 17 21:37:00.232321 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232213 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 21:37:00.232321 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232221 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 21:37:00.232321 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.232260 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls podName:3bab4632-8085-482e-acfd-ff3769ca3407 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:00.732240376 +0000 UTC m=+34.566128281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2blqm" (UID: "3bab4632-8085-482e-acfd-ff3769ca3407") : secret "samples-operator-tls" not found Apr 17 21:37:00.232321 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8drgc\" (UniqueName: \"kubernetes.io/projected/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-kube-api-access-8drgc\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr" Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-image-registry-private-configuration\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232308 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" 
Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxntf\" (UniqueName: \"kubernetes.io/projected/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-kube-api-access-sxntf\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdm7\" (UniqueName: \"kubernetes.io/projected/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-kube-api-access-9zdm7\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwk4s\" (UniqueName: \"kubernetes.io/projected/3bab4632-8085-482e-acfd-ff3769ca3407-kube-api-access-hwk4s\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm"
Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-trusted-ca\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-trusted-ca\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:00.232660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-installation-pull-secrets\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgg4t\" (UniqueName: \"kubernetes.io/projected/538ad283-f697-4eb8-b901-99763f2b9340-kube-api-access-vgg4t\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-config\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-registry-certificates\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-tmp\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-snapshots\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.232907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-serving-cert\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.233014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-default-certificate\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.233044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-stats-auth\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.233070 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51814314-02b7-436f-83f8-4acfbdf378b7-ca-trust-extracted\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.233165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clms7\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-kube-api-access-clms7\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.233200 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.233194 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.233710 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.233220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-service-ca-bundle\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.233710 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.233606 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-config\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:00.235104 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.234543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-trusted-ca\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:00.238040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.238016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-serving-cert\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:00.241519 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.241494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdm7\" (UniqueName: \"kubernetes.io/projected/152480c2-ecf4-4eab-a6b2-3f71ca86e6c0-kube-api-access-9zdm7\") pod \"console-operator-9d4b6777b-h57tx\" (UID: \"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0\") " pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:00.241519 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.241513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwk4s\" (UniqueName: \"kubernetes.io/projected/3bab4632-8085-482e-acfd-ff3769ca3407-kube-api-access-hwk4s\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm"
Apr 17 21:37:00.247607 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.247527 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"]
Apr 17 21:37:00.247727 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.247656 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"
Apr 17 21:37:00.250373 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.250352 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 17 21:37:00.265738 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.265711 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr"]
Apr 17 21:37:00.265738 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.265741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"]
Apr 17 21:37:00.265934 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.265755 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-625wr"]
Apr 17 21:37:00.265934 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.265766 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5477664494-rrxhm"]
Apr 17 21:37:00.265934 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.265781 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8wmk5"]
Apr 17 21:37:00.265934 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.265884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.268462 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.268430 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 17 21:37:00.268617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.268475 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 17 21:37:00.268617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.268513 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 17 21:37:00.268803 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.268784 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 17 21:37:00.283203 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283180 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7646cfc968-5h87d"]
Apr 17 21:37:00.283203 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283208 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5"]
Apr 17 21:37:00.283356 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283221 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6"]
Apr 17 21:37:00.283356 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283234 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf"]
Apr 17 21:37:00.283356 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283246 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk"]
Apr 17 21:37:00.283356 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283256 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8wmk5"]
Apr 17 21:37:00.283356 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283263 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"]
Apr 17 21:37:00.283356 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283272 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"]
Apr 17 21:37:00.283356 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283294 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lcgfs"]
Apr 17 21:37:00.284160 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.283747 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8wmk5"
Apr 17 21:37:00.286199 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.286177 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 21:37:00.286445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.286429 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 21:37:00.286445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.286443 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 21:37:00.286567 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.286451 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m9tzd\""
Apr 17 21:37:00.303188 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.303164 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lcgfs"]
Apr 17 21:37:00.303359 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.303341 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:00.306194 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.306174 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 21:37:00.306316 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.306207 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wjbk2\""
Apr 17 21:37:00.306316 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.306295 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 21:37:00.334303 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334275 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03daabdf-1d60-48ca-aaef-1d5e74ad468c-tmp\") pod \"klusterlet-addon-workmgr-5466ff547-b9gcn\" (UID: \"03daabdf-1d60-48ca-aaef-1d5e74ad468c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"
Apr 17 21:37:00.334478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-trusted-ca\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.334478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.334478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334373 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-installation-pull-secrets\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.334478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgg4t\" (UniqueName: \"kubernetes.io/projected/538ad283-f697-4eb8-b901-99763f2b9340-kube-api-access-vgg4t\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:37:00.334478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.334478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334451 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ee90e9e-74f6-4a93-b52b-9e82c5789ae2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwfp5\" (UID: \"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5"
Apr 17 21:37:00.334784 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee63485a-d5c2-456b-998c-2c74fec67448-config\") pod \"service-ca-operator-d6fc45fc5-rn4mk\" (UID: \"ee63485a-d5c2-456b-998c-2c74fec67448\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk"
Apr 17 21:37:00.334784 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.334508 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:00.834485291 +0000 UTC m=+34.668373217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : configmap references non-existent config key: service-ca.crt
Apr 17 21:37:00.334784 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-registry-certificates\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.334784 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-tmp\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.334784 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-snapshots\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.334784 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pzr9\" (UniqueName: \"kubernetes.io/projected/05cac4af-b010-4bb6-ba12-8d55e0fedc36-kube-api-access-8pzr9\") pod \"network-check-source-8894fc9bd-kgrg6\" (UID: \"05cac4af-b010-4bb6-ba12-8d55e0fedc36\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6"
Apr 17 21:37:00.334784 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhp6b\" (UniqueName: \"kubernetes.io/projected/0876a7c3-96c2-4faa-8b75-ec7acb2d05b9-kube-api-access-vhp6b\") pod \"managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf\" (UID: \"0876a7c3-96c2-4faa-8b75-ec7acb2d05b9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf"
Apr 17 21:37:00.334784 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-default-certificate\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.334784 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-ca\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-stats-auth\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51814314-02b7-436f-83f8-4acfbdf378b7-ca-trust-extracted\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-hub\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clms7\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-kube-api-access-clms7\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.334976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-service-ca-bundle\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs4p8\" (UniqueName: \"kubernetes.io/projected/ee63485a-d5c2-456b-998c-2c74fec67448-kube-api-access-hs4p8\") pod \"service-ca-operator-d6fc45fc5-rn4mk\" (UID: \"ee63485a-d5c2-456b-998c-2c74fec67448\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-bound-sa-token\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-serving-cert\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-tmp\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzdvl\" (UniqueName: \"kubernetes.io/projected/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-kube-api-access-lzdvl\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-registry-certificates\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.335247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9pq4\" (UniqueName: \"kubernetes.io/projected/fa074a8e-430b-4109-87d9-1949bf0c1d86-kube-api-access-f9pq4\") pod \"volume-data-source-validator-7c6cbb6c87-lqwtr\" (UID: \"fa074a8e-430b-4109-87d9-1949bf0c1d86\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/538ad283-f697-4eb8-b901-99763f2b9340-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee63485a-d5c2-456b-998c-2c74fec67448-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rn4mk\" (UID: \"ee63485a-d5c2-456b-998c-2c74fec67448\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335313 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0876a7c3-96c2-4faa-8b75-ec7acb2d05b9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf\" (UID: \"0876a7c3-96c2-4faa-8b75-ec7acb2d05b9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-trusted-ca\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8drgc\" (UniqueName: \"kubernetes.io/projected/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-kube-api-access-8drgc\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-snapshots\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-service-ca-bundle\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.335900 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.335915 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5477664494-rrxhm: secret "image-registry-tls" not found
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.335948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.335969 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls podName:51814314-02b7-436f-83f8-4acfbdf378b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:00.835952949 +0000 UTC m=+34.669840868 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls") pod "image-registry-5477664494-rrxhm" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7") : secret "image-registry-tls" not found Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.336020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-image-registry-private-configuration\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.336054 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/03daabdf-1d60-48ca-aaef-1d5e74ad468c-klusterlet-config\") pod \"klusterlet-addon-workmgr-5466ff547-b9gcn\" (UID: \"03daabdf-1d60-48ca-aaef-1d5e74ad468c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn" Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.336122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxntf\" (UniqueName: \"kubernetes.io/projected/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-kube-api-access-sxntf\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:00.336309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.336153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mr7x\" (UniqueName: \"kubernetes.io/projected/03daabdf-1d60-48ca-aaef-1d5e74ad468c-kube-api-access-4mr7x\") pod 
\"klusterlet-addon-workmgr-5466ff547-b9gcn\" (UID: \"03daabdf-1d60-48ca-aaef-1d5e74ad468c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn" Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.336208 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.336240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8v5l\" (UniqueName: \"kubernetes.io/projected/4ee90e9e-74f6-4a93-b52b-9e82c5789ae2-kube-api-access-n8v5l\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwfp5\" (UID: \"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.336395 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.336445 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls podName:538ad283-f697-4eb8-b901-99763f2b9340 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:00.836430624 +0000 UTC m=+34.670318540 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vmb5v" (UID: "538ad283-f697-4eb8-b901-99763f2b9340") : secret "cluster-monitoring-operator-tls" not found Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.336432 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/538ad283-f697-4eb8-b901-99763f2b9340-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.336539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee90e9e-74f6-4a93-b52b-9e82c5789ae2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwfp5\" (UID: \"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.336584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.336711 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: E0417 
21:37:00.336779 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:00.836763794 +0000 UTC m=+34.670651710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : secret "router-metrics-certs-default" not found Apr 17 21:37:00.337664 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.337647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-installation-pull-secrets\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.338120 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.338061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-serving-cert\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr" Apr 17 21:37:00.338399 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.338364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-stats-auth\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:00.338952 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.338932 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-image-registry-private-configuration\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.343356 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.343335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51814314-02b7-436f-83f8-4acfbdf378b7-ca-trust-extracted\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.344398 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.344269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgg4t\" (UniqueName: \"kubernetes.io/projected/538ad283-f697-4eb8-b901-99763f2b9340-kube-api-access-vgg4t\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" Apr 17 21:37:00.344496 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.344461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clms7\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-kube-api-access-clms7\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.345315 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.345296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-bound-sa-token\") pod \"image-registry-5477664494-rrxhm\" (UID: 
\"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:00.346694 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.346669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9pq4\" (UniqueName: \"kubernetes.io/projected/fa074a8e-430b-4109-87d9-1949bf0c1d86-kube-api-access-f9pq4\") pod \"volume-data-source-validator-7c6cbb6c87-lqwtr\" (UID: \"fa074a8e-430b-4109-87d9-1949bf0c1d86\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr" Apr 17 21:37:00.346798 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.346753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-default-certificate\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:00.347061 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.347043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8drgc\" (UniqueName: \"kubernetes.io/projected/cd17cea4-4ab5-4023-b0a8-ebd4db6056d6-kube-api-access-8drgc\") pod \"insights-operator-585dfdc468-625wr\" (UID: \"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6\") " pod="openshift-insights/insights-operator-585dfdc468-625wr" Apr 17 21:37:00.348707 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.348683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxntf\" (UniqueName: \"kubernetes.io/projected/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-kube-api-access-sxntf\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:00.373452 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.373423 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" Apr 17 21:37:00.418561 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.418520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr" Apr 17 21:37:00.437645 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.437607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" Apr 17 21:37:00.437645 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.437657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ee90e9e-74f6-4a93-b52b-9e82c5789ae2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwfp5\" (UID: \"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" Apr 17 21:37:00.437866 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.437690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee63485a-d5c2-456b-998c-2c74fec67448-config\") pod \"service-ca-operator-d6fc45fc5-rn4mk\" (UID: \"ee63485a-d5c2-456b-998c-2c74fec67448\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" Apr 17 21:37:00.437918 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.437857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pzr9\" (UniqueName: \"kubernetes.io/projected/05cac4af-b010-4bb6-ba12-8d55e0fedc36-kube-api-access-8pzr9\") pod 
\"network-check-source-8894fc9bd-kgrg6\" (UID: \"05cac4af-b010-4bb6-ba12-8d55e0fedc36\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6" Apr 17 21:37:00.437918 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.437902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhp6b\" (UniqueName: \"kubernetes.io/projected/0876a7c3-96c2-4faa-8b75-ec7acb2d05b9-kube-api-access-vhp6b\") pod \"managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf\" (UID: \"0876a7c3-96c2-4faa-8b75-ec7acb2d05b9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf" Apr 17 21:37:00.438026 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.437934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs" Apr 17 21:37:00.438026 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.437975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-ca\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" Apr 17 21:37:00.438026 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-hub\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" Apr 17 21:37:00.438209 ip-10-0-141-47 kubenswrapper[2576]: I0417 
21:37:00.438049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs4p8\" (UniqueName: \"kubernetes.io/projected/ee63485a-d5c2-456b-998c-2c74fec67448-kube-api-access-hs4p8\") pod \"service-ca-operator-d6fc45fc5-rn4mk\" (UID: \"ee63485a-d5c2-456b-998c-2c74fec67448\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" Apr 17 21:37:00.438209 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" Apr 17 21:37:00.438209 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzdvl\" (UniqueName: \"kubernetes.io/projected/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-kube-api-access-lzdvl\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" Apr 17 21:37:00.438209 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhqxf\" (UniqueName: \"kubernetes.io/projected/6a2f302e-4951-4581-a1e6-f71a43573912-kube-api-access-lhqxf\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5" Apr 17 21:37:00.438209 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/da349b59-9fbe-4add-9ba3-8270d5731310-tmp-dir\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee63485a-d5c2-456b-998c-2c74fec67448-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rn4mk\" (UID: \"ee63485a-d5c2-456b-998c-2c74fec67448\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0876a7c3-96c2-4faa-8b75-ec7acb2d05b9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf\" (UID: \"0876a7c3-96c2-4faa-8b75-ec7acb2d05b9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7h4l\" (UniqueName: \"kubernetes.io/projected/da349b59-9fbe-4add-9ba3-8270d5731310-kube-api-access-n7h4l\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee63485a-d5c2-456b-998c-2c74fec67448-config\") pod \"service-ca-operator-d6fc45fc5-rn4mk\" (UID: \"ee63485a-d5c2-456b-998c-2c74fec67448\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: 
I0417 21:37:00.438318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ee90e9e-74f6-4a93-b52b-9e82c5789ae2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwfp5\" (UID: \"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/03daabdf-1d60-48ca-aaef-1d5e74ad468c-klusterlet-config\") pod \"klusterlet-addon-workmgr-5466ff547-b9gcn\" (UID: \"03daabdf-1d60-48ca-aaef-1d5e74ad468c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mr7x\" (UniqueName: 
\"kubernetes.io/projected/03daabdf-1d60-48ca-aaef-1d5e74ad468c-kube-api-access-4mr7x\") pod \"klusterlet-addon-workmgr-5466ff547-b9gcn\" (UID: \"03daabdf-1d60-48ca-aaef-1d5e74ad468c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn" Apr 17 21:37:00.438445 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8v5l\" (UniqueName: \"kubernetes.io/projected/4ee90e9e-74f6-4a93-b52b-9e82c5789ae2-kube-api-access-n8v5l\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwfp5\" (UID: \"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" Apr 17 21:37:00.438911 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.438716 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:37:00.438911 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.438774 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs podName:697e918d-013d-41df-9440-059bd3d99a19 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:32.438759566 +0000 UTC m=+66.272647475 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs") pod "network-metrics-daemon-lncjj" (UID: "697e918d-013d-41df-9440-059bd3d99a19") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:37:00.439021 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.438959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" Apr 17 21:37:00.439153 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.439130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee90e9e-74f6-4a93-b52b-9e82c5789ae2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwfp5\" (UID: \"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" Apr 17 21:37:00.439218 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.439199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03daabdf-1d60-48ca-aaef-1d5e74ad468c-tmp\") pod \"klusterlet-addon-workmgr-5466ff547-b9gcn\" (UID: \"03daabdf-1d60-48ca-aaef-1d5e74ad468c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn" Apr 17 21:37:00.439331 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.439234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da349b59-9fbe-4add-9ba3-8270d5731310-config-volume\") pod \"dns-default-lcgfs\" (UID: 
\"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs" Apr 17 21:37:00.439331 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.439260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5" Apr 17 21:37:00.439667 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.439643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03daabdf-1d60-48ca-aaef-1d5e74ad468c-tmp\") pod \"klusterlet-addon-workmgr-5466ff547-b9gcn\" (UID: \"03daabdf-1d60-48ca-aaef-1d5e74ad468c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn" Apr 17 21:37:00.441684 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.441478 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-625wr"
Apr 17 21:37:00.441684 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.441486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-ca\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.441684 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.441656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.441684 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.441657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ee90e9e-74f6-4a93-b52b-9e82c5789ae2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwfp5\" (UID: \"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5"
Apr 17 21:37:00.441917 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.441862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.442251 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.442229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-hub\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.442378 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.442338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee63485a-d5c2-456b-998c-2c74fec67448-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rn4mk\" (UID: \"ee63485a-d5c2-456b-998c-2c74fec67448\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk"
Apr 17 21:37:00.442643 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.442627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/0876a7c3-96c2-4faa-8b75-ec7acb2d05b9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf\" (UID: \"0876a7c3-96c2-4faa-8b75-ec7acb2d05b9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf"
Apr 17 21:37:00.442927 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.442903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/03daabdf-1d60-48ca-aaef-1d5e74ad468c-klusterlet-config\") pod \"klusterlet-addon-workmgr-5466ff547-b9gcn\" (UID: \"03daabdf-1d60-48ca-aaef-1d5e74ad468c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"
Apr 17 21:37:00.446862 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.446839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhp6b\" (UniqueName: \"kubernetes.io/projected/0876a7c3-96c2-4faa-8b75-ec7acb2d05b9-kube-api-access-vhp6b\") pod \"managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf\" (UID: \"0876a7c3-96c2-4faa-8b75-ec7acb2d05b9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf"
Apr 17 21:37:00.447045 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.447018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzdvl\" (UniqueName: \"kubernetes.io/projected/1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0-kube-api-access-lzdvl\") pod \"cluster-proxy-proxy-agent-57b565c99f-5gdx6\" (UID: \"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.450265 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.450245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8v5l\" (UniqueName: \"kubernetes.io/projected/4ee90e9e-74f6-4a93-b52b-9e82c5789ae2-kube-api-access-n8v5l\") pod \"kube-storage-version-migrator-operator-6769c5d45-qwfp5\" (UID: \"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5"
Apr 17 21:37:00.450367 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.450342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pzr9\" (UniqueName: \"kubernetes.io/projected/05cac4af-b010-4bb6-ba12-8d55e0fedc36-kube-api-access-8pzr9\") pod \"network-check-source-8894fc9bd-kgrg6\" (UID: \"05cac4af-b010-4bb6-ba12-8d55e0fedc36\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6"
Apr 17 21:37:00.450560 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.450540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mr7x\" (UniqueName: \"kubernetes.io/projected/03daabdf-1d60-48ca-aaef-1d5e74ad468c-kube-api-access-4mr7x\") pod \"klusterlet-addon-workmgr-5466ff547-b9gcn\" (UID: \"03daabdf-1d60-48ca-aaef-1d5e74ad468c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"
Apr 17 21:37:00.450620 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.450583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs4p8\" (UniqueName: \"kubernetes.io/projected/ee63485a-d5c2-456b-998c-2c74fec67448-kube-api-access-hs4p8\") pod \"service-ca-operator-d6fc45fc5-rn4mk\" (UID: \"ee63485a-d5c2-456b-998c-2c74fec67448\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk"
Apr 17 21:37:00.485862 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.485823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5"
Apr 17 21:37:00.507703 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.507621 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk"
Apr 17 21:37:00.521426 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.521389 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6"
Apr 17 21:37:00.540534 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.540502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxfgz\" (UniqueName: \"kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz\") pod \"network-check-target-jlszn\" (UID: \"9a699952-32f8-4727-bb72-f047e0297d2f\") " pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:37:00.540677 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.540570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhqxf\" (UniqueName: \"kubernetes.io/projected/6a2f302e-4951-4581-a1e6-f71a43573912-kube-api-access-lhqxf\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5"
Apr 17 21:37:00.540677 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.540605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/da349b59-9fbe-4add-9ba3-8270d5731310-tmp-dir\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:00.540677 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.540648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7h4l\" (UniqueName: \"kubernetes.io/projected/da349b59-9fbe-4add-9ba3-8270d5731310-kube-api-access-n7h4l\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:00.540803 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.540732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da349b59-9fbe-4add-9ba3-8270d5731310-config-volume\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:00.540803 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.540754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5"
Apr 17 21:37:00.540887 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.540826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:00.540937 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.540930 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:37:00.541051 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.540987 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls podName:da349b59-9fbe-4add-9ba3-8270d5731310 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:01.040969232 +0000 UTC m=+34.874857150 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls") pod "dns-default-lcgfs" (UID: "da349b59-9fbe-4add-9ba3-8270d5731310") : secret "dns-default-metrics-tls" not found
Apr 17 21:37:00.541051 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.540941 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:37:00.541051 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.541025 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert podName:6a2f302e-4951-4581-a1e6-f71a43573912 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:01.041014746 +0000 UTC m=+34.874902668 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert") pod "ingress-canary-8wmk5" (UID: "6a2f302e-4951-4581-a1e6-f71a43573912") : secret "canary-serving-cert" not found
Apr 17 21:37:00.541335 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.541317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da349b59-9fbe-4add-9ba3-8270d5731310-config-volume\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:00.543362 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.543340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxfgz\" (UniqueName: \"kubernetes.io/projected/9a699952-32f8-4727-bb72-f047e0297d2f-kube-api-access-lxfgz\") pod \"network-check-target-jlszn\" (UID: \"9a699952-32f8-4727-bb72-f047e0297d2f\") " pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:37:00.548584 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.548564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7h4l\" (UniqueName: \"kubernetes.io/projected/da349b59-9fbe-4add-9ba3-8270d5731310-kube-api-access-n7h4l\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:00.548691 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.548633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhqxf\" (UniqueName: \"kubernetes.io/projected/6a2f302e-4951-4581-a1e6-f71a43573912-kube-api-access-lhqxf\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5"
Apr 17 21:37:00.553117 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.553093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/da349b59-9fbe-4add-9ba3-8270d5731310-tmp-dir\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:00.566137 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.566108 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf"
Apr 17 21:37:00.574387 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.574366 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"
Apr 17 21:37:00.580021 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.579997 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"
Apr 17 21:37:00.742416 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.742329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm"
Apr 17 21:37:00.742865 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.742493 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 21:37:00.742865 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.742571 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls podName:3bab4632-8085-482e-acfd-ff3769ca3407 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:01.74255182 +0000 UTC m=+35.576439731 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2blqm" (UID: "3bab4632-8085-482e-acfd-ff3769ca3407") : secret "samples-operator-tls" not found
Apr 17 21:37:00.843535 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.843484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:37:00.843698 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.843553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.843698 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.843605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:00.843698 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.843639 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 21:37:00.843698 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.843670 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 21:37:00.843698 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:00.843693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:00.843914 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.843738 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:01.843716429 +0000 UTC m=+35.677604345 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : secret "router-metrics-certs-default" not found
Apr 17 21:37:00.843914 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.843754 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls podName:538ad283-f697-4eb8-b901-99763f2b9340 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:01.843747696 +0000 UTC m=+35.677635603 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vmb5v" (UID: "538ad283-f697-4eb8-b901-99763f2b9340") : secret "cluster-monitoring-operator-tls" not found
Apr 17 21:37:00.843914 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.843802 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 21:37:00.843914 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.843815 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5477664494-rrxhm: secret "image-registry-tls" not found
Apr 17 21:37:00.843914 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.843829 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:01.843812421 +0000 UTC m=+35.677700333 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : configmap references non-existent config key: service-ca.crt
Apr 17 21:37:00.843914 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:00.843868 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls podName:51814314-02b7-436f-83f8-4acfbdf378b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:01.843853067 +0000 UTC m=+35.677740987 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls") pod "image-registry-5477664494-rrxhm" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7") : secret "image-registry-tls" not found
Apr 17 21:37:01.045471 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.045441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5"
Apr 17 21:37:01.045660 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.045599 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:37:01.045660 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.045635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:01.045771 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.045668 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert podName:6a2f302e-4951-4581-a1e6-f71a43573912 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:02.04564877 +0000 UTC m=+35.879536691 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert") pod "ingress-canary-8wmk5" (UID: "6a2f302e-4951-4581-a1e6-f71a43573912") : secret "canary-serving-cert" not found
Apr 17 21:37:01.045771 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.045725 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:37:01.045771 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.045771 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls podName:da349b59-9fbe-4add-9ba3-8270d5731310 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:02.045759224 +0000 UTC m=+35.879647134 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls") pod "dns-default-lcgfs" (UID: "da349b59-9fbe-4add-9ba3-8270d5731310") : secret "dns-default-metrics-tls" not found
Apr 17 21:37:01.740489 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.740451 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:37:01.740684 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.740451 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj"
Apr 17 21:37:01.740684 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.740452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:37:01.744479 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.744448 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9r26l\""
Apr 17 21:37:01.744858 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.744499 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nlncx\""
Apr 17 21:37:01.744858 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.744499 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 21:37:01.744858 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.744581 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 21:37:01.751905 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.751881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm"
Apr 17 21:37:01.752032 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.752013 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 21:37:01.752146 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.752131 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls podName:3bab4632-8085-482e-acfd-ff3769ca3407 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:03.752115345 +0000 UTC m=+37.586003255 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2blqm" (UID: "3bab4632-8085-482e-acfd-ff3769ca3407") : secret "samples-operator-tls" not found
Apr 17 21:37:01.765170 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.765140 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:37:01.853400 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.853357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:01.853598 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.853451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:37:01.853598 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.853492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:01.853598 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.853556 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 21:37:01.853598 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.853577 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5477664494-rrxhm: secret "image-registry-tls" not found
Apr 17 21:37:01.853795 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.853602 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 21:37:01.853795 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.853609 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 21:37:01.853795 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.853639 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls podName:51814314-02b7-436f-83f8-4acfbdf378b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:03.853617832 +0000 UTC m=+37.687505744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls") pod "image-registry-5477664494-rrxhm" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7") : secret "image-registry-tls" not found
Apr 17 21:37:01.853795 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.853734 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:03.853721817 +0000 UTC m=+37.687609737 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : secret "router-metrics-certs-default" not found
Apr 17 21:37:01.853795 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.853749 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls podName:538ad283-f697-4eb8-b901-99763f2b9340 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:03.853743656 +0000 UTC m=+37.687631562 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vmb5v" (UID: "538ad283-f697-4eb8-b901-99763f2b9340") : secret "cluster-monitoring-operator-tls" not found
Apr 17 21:37:01.853795 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:01.853779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:01.854063 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:01.853852 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:03.85384101 +0000 UTC m=+37.687728934 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : configmap references non-existent config key: service-ca.crt
Apr 17 21:37:02.055850 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.055802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5"
Apr 17 21:37:02.056010 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.055901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:02.056105 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:02.056042 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:37:02.056169 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:02.056105 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:37:02.056169 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:02.056128 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls podName:da349b59-9fbe-4add-9ba3-8270d5731310 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:04.056106982 +0000 UTC m=+37.889994910 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls") pod "dns-default-lcgfs" (UID: "da349b59-9fbe-4add-9ba3-8270d5731310") : secret "dns-default-metrics-tls" not found
Apr 17 21:37:02.056270 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:02.056172 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert podName:6a2f302e-4951-4581-a1e6-f71a43573912 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:04.056152446 +0000 UTC m=+37.890040373 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert") pod "ingress-canary-8wmk5" (UID: "6a2f302e-4951-4581-a1e6-f71a43573912") : secret "canary-serving-cert" not found
Apr 17 21:37:02.307958 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.307703 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5"]
Apr 17 21:37:02.309185 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.309132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr"]
Apr 17 21:37:02.317802 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.313866 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6"]
Apr 17 21:37:02.317802 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.315641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf"]
Apr 17 21:37:02.327655 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.327223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"]
Apr 17 21:37:02.336670 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.335573 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6"]
Apr 17 21:37:02.337444 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.337391 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk"]
Apr 17 21:37:02.338260 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.338184 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-625wr"]
Apr 17 21:37:02.341674 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.340173 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jlszn"]
Apr 17 21:37:02.342940 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.342910 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-h57tx"]
Apr 17 21:37:02.344598 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:37:02.344573 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05cac4af_b010_4bb6_ba12_8d55e0fedc36.slice/crio-800163b1439d1a5def11ea0456af897038263533006cef4a63ec20a2e5d8d0f0 WatchSource:0}: Error finding container 800163b1439d1a5def11ea0456af897038263533006cef4a63ec20a2e5d8d0f0: Status 404 returned error can't find the container with id 800163b1439d1a5def11ea0456af897038263533006cef4a63ec20a2e5d8d0f0
Apr 17 21:37:02.351463 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:37:02.351400 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a699952_32f8_4727_bb72_f047e0297d2f.slice/crio-a90e51a56dbe2a8469bd2d7d6de552f8905d091221d1d8f6cbb5a34c66d15939 WatchSource:0}: Error finding container a90e51a56dbe2a8469bd2d7d6de552f8905d091221d1d8f6cbb5a34c66d15939: Status 404 returned error can't find the container with id a90e51a56dbe2a8469bd2d7d6de552f8905d091221d1d8f6cbb5a34c66d15939
Apr 17 21:37:02.352594 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:37:02.352569 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod152480c2_ecf4_4eab_a6b2_3f71ca86e6c0.slice/crio-2c2737b0b2a8b2fafb04a70a1383a56ecec5fe219c215b3fd70785de0888d0ec WatchSource:0}: Error finding container 2c2737b0b2a8b2fafb04a70a1383a56ecec5fe219c215b3fd70785de0888d0ec: Status 404 returned error can't find the container with id 2c2737b0b2a8b2fafb04a70a1383a56ecec5fe219c215b3fd70785de0888d0ec
Apr 17 21:37:02.919938 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.919653 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr" event={"ID":"fa074a8e-430b-4109-87d9-1949bf0c1d86","Type":"ContainerStarted","Data":"002b973d5c0856071f0e2978f5e0bd0ca179bd17287230641dc59161f579fab4"}
Apr 17 21:37:02.921020 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.920970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" event={"ID":"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0","Type":"ContainerStarted","Data":"4fbf3b224bb4ce4b31449aa753c92ea4009884b106c28ad5150a4f3d6e2883e5"}
Apr 17 21:37:02.922155 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.922123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-625wr" event={"ID":"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6","Type":"ContainerStarted","Data":"4f1b7b54ba2b0edf8233a3accadc796b076a78b304b5fc09bb94ee96f70415c3"}
Apr 17 21:37:02.923434 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.923366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6" event={"ID":"05cac4af-b010-4bb6-ba12-8d55e0fedc36","Type":"ContainerStarted","Data":"800163b1439d1a5def11ea0456af897038263533006cef4a63ec20a2e5d8d0f0"}
Apr 17 21:37:02.924652 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.924606 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jlszn" event={"ID":"9a699952-32f8-4727-bb72-f047e0297d2f","Type":"ContainerStarted","Data":"a90e51a56dbe2a8469bd2d7d6de552f8905d091221d1d8f6cbb5a34c66d15939"}
Apr 17 21:37:02.927096 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.927022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" event={"ID":"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0","Type":"ContainerStarted","Data":"2c2737b0b2a8b2fafb04a70a1383a56ecec5fe219c215b3fd70785de0888d0ec"}
Apr 17 21:37:02.928706 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.928682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" event={"ID":"ee63485a-d5c2-456b-998c-2c74fec67448","Type":"ContainerStarted","Data":"52f4bb28d387db3dcec7e909ab55d7af3f8edd75ae28c4df3edce1d48c8934b7"}
Apr 17 21:37:02.930455 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.930429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" event={"ID":"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2","Type":"ContainerStarted","Data":"0dd0586be25793fb62e7bd14baaab5202df28f86930c0a6975c04a47fb3e2bb9"}
Apr 17 21:37:02.932128 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.932094 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"
event={"ID":"03daabdf-1d60-48ca-aaef-1d5e74ad468c","Type":"ContainerStarted","Data":"389038bec676a242df83f55a2e7290279b7459fb834ff631c3bc4ebcbad1c1a2"} Apr 17 21:37:02.936542 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.936516 2576 generic.go:358] "Generic (PLEG): container finished" podID="996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e" containerID="bccfe6e8c472a6c51f0d717d3d0f7a93f48369300ab0e74785ac44afd72105f7" exitCode=0 Apr 17 21:37:02.936655 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.936596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" event={"ID":"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e","Type":"ContainerDied","Data":"bccfe6e8c472a6c51f0d717d3d0f7a93f48369300ab0e74785ac44afd72105f7"} Apr 17 21:37:02.941701 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:02.941675 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf" event={"ID":"0876a7c3-96c2-4faa-8b75-ec7acb2d05b9","Type":"ContainerStarted","Data":"da0eee0832b6faa9b56f2fd07220185bc0908f8a8fed1e9b22cc085541bbd97a"} Apr 17 21:37:03.776251 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:03.776213 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" Apr 17 21:37:03.776603 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.776584 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 21:37:03.776694 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.776657 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls podName:3bab4632-8085-482e-acfd-ff3769ca3407 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:07.776637435 +0000 UTC m=+41.610525362 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2blqm" (UID: "3bab4632-8085-482e-acfd-ff3769ca3407") : secret "samples-operator-tls" not found Apr 17 21:37:03.877066 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:03.877024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:03.877242 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:03.877153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" Apr 17 21:37:03.877242 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:03.877190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:03.877242 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:03.877234 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:03.877587 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.877567 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:07.877545936 +0000 UTC m=+41.711433842 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : configmap references non-existent config key: service-ca.crt Apr 17 21:37:03.878111 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.878068 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:37:03.878111 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.878107 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5477664494-rrxhm: secret "image-registry-tls" not found Apr 17 21:37:03.878249 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.878157 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls podName:51814314-02b7-436f-83f8-4acfbdf378b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:07.878139931 +0000 UTC m=+41.712027840 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls") pod "image-registry-5477664494-rrxhm" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7") : secret "image-registry-tls" not found Apr 17 21:37:03.878699 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.878682 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 21:37:03.878785 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.878734 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:07.878720643 +0000 UTC m=+41.712608553 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : secret "router-metrics-certs-default" not found Apr 17 21:37:03.879178 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.879039 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 21:37:03.879178 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:03.879109 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls podName:538ad283-f697-4eb8-b901-99763f2b9340 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:07.879094941 +0000 UTC m=+41.712982850 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vmb5v" (UID: "538ad283-f697-4eb8-b901-99763f2b9340") : secret "cluster-monitoring-operator-tls" not found Apr 17 21:37:03.961046 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:03.960136 2576 generic.go:358] "Generic (PLEG): container finished" podID="996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e" containerID="203fbeb18ae0e25d71111ff0a44de6175eac2ad82a4c1e83fdbc0f0824408765" exitCode=0 Apr 17 21:37:03.961046 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:03.960213 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" event={"ID":"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e","Type":"ContainerDied","Data":"203fbeb18ae0e25d71111ff0a44de6175eac2ad82a4c1e83fdbc0f0824408765"} Apr 17 21:37:04.079521 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:04.079475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5" Apr 17 21:37:04.079694 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:04.079600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs" Apr 17 21:37:04.080181 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:04.080157 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:37:04.080289 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:04.080228 
2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls podName:da349b59-9fbe-4add-9ba3-8270d5731310 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:08.080207874 +0000 UTC m=+41.914095797 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls") pod "dns-default-lcgfs" (UID: "da349b59-9fbe-4add-9ba3-8270d5731310") : secret "dns-default-metrics-tls" not found Apr 17 21:37:04.080412 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:04.080396 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:37:04.080471 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:04.080444 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert podName:6a2f302e-4951-4581-a1e6-f71a43573912 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:08.080431121 +0000 UTC m=+41.914319032 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert") pod "ingress-canary-8wmk5" (UID: "6a2f302e-4951-4581-a1e6-f71a43573912") : secret "canary-serving-cert" not found Apr 17 21:37:04.982106 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:04.981859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" event={"ID":"996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e","Type":"ContainerStarted","Data":"6ccefb333210d701bdc41b46213de346babeab6ad9f14493180ec2652af8ec61"} Apr 17 21:37:05.008909 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:05.008852 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b5rqj" podStartSLOduration=5.322890382 podStartE2EDuration="38.008832646s" podCreationTimestamp="2026-04-17 21:36:27 +0000 UTC" firstStartedPulling="2026-04-17 21:36:29.667041662 +0000 UTC m=+3.500929569" lastFinishedPulling="2026-04-17 21:37:02.352983911 +0000 UTC m=+36.186871833" observedRunningTime="2026-04-17 21:37:05.005930789 +0000 UTC m=+38.839818742" watchObservedRunningTime="2026-04-17 21:37:05.008832646 +0000 UTC m=+38.842720578" Apr 17 21:37:07.820374 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:07.820332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" Apr 17 21:37:07.820800 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:07.820500 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 21:37:07.820800 ip-10-0-141-47 kubenswrapper[2576]: E0417 
21:37:07.820589 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls podName:3bab4632-8085-482e-acfd-ff3769ca3407 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:15.820565828 +0000 UTC m=+49.654453754 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2blqm" (UID: "3bab4632-8085-482e-acfd-ff3769ca3407") : secret "samples-operator-tls" not found Apr 17 21:37:07.920916 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:07.920875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:07.921122 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:07.920986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" Apr 17 21:37:07.921122 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:07.921024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:07.921122 ip-10-0-141-47 
kubenswrapper[2576]: E0417 21:37:07.921054 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:37:07.921122 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:07.921086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:07.921122 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:07.921093 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5477664494-rrxhm: secret "image-registry-tls" not found Apr 17 21:37:07.921400 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:07.921173 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 21:37:07.921400 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:07.921182 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls podName:51814314-02b7-436f-83f8-4acfbdf378b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:15.921158731 +0000 UTC m=+49.755046640 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls") pod "image-registry-5477664494-rrxhm" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7") : secret "image-registry-tls" not found Apr 17 21:37:07.921400 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:07.921248 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 21:37:07.921400 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:07.921254 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:15.921237773 +0000 UTC m=+49.755125683 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : configmap references non-existent config key: service-ca.crt Apr 17 21:37:07.921400 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:07.921337 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls podName:538ad283-f697-4eb8-b901-99763f2b9340 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:15.921320193 +0000 UTC m=+49.755208101 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vmb5v" (UID: "538ad283-f697-4eb8-b901-99763f2b9340") : secret "cluster-monitoring-operator-tls" not found Apr 17 21:37:07.921400 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:07.921362 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:15.921350603 +0000 UTC m=+49.755238516 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : secret "router-metrics-certs-default" not found Apr 17 21:37:08.123173 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:08.123129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5" Apr 17 21:37:08.123380 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:08.123235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs" Apr 17 21:37:08.123380 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:08.123317 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:37:08.123380 
ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:08.123345 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:37:08.123557 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:08.123402 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert podName:6a2f302e-4951-4581-a1e6-f71a43573912 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:16.123380118 +0000 UTC m=+49.957268038 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert") pod "ingress-canary-8wmk5" (UID: "6a2f302e-4951-4581-a1e6-f71a43573912") : secret "canary-serving-cert" not found Apr 17 21:37:08.123557 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:08.123422 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls podName:da349b59-9fbe-4add-9ba3-8270d5731310 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:16.123412435 +0000 UTC m=+49.957300343 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls") pod "dns-default-lcgfs" (UID: "da349b59-9fbe-4add-9ba3-8270d5731310") : secret "dns-default-metrics-tls" not found Apr 17 21:37:14.003151 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.003045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf" event={"ID":"0876a7c3-96c2-4faa-8b75-ec7acb2d05b9","Type":"ContainerStarted","Data":"5a104b165941baab2f3fad5202543e19f5b316eb41c7da42ffc763bd24f01a7d"} Apr 17 21:37:14.004567 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.004533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr" event={"ID":"fa074a8e-430b-4109-87d9-1949bf0c1d86","Type":"ContainerStarted","Data":"b0d9405722398ead369b00b2cb6e05b45c71b579c5a6fef078567475f473e04f"} Apr 17 21:37:14.006338 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.006315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" event={"ID":"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0","Type":"ContainerStarted","Data":"76bc1998f68773fa95a464ec22a0d3d4b29aa4dddd3539f28d188737a57eed04"} Apr 17 21:37:14.007758 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.007729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-625wr" event={"ID":"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6","Type":"ContainerStarted","Data":"a90a097220ff8fa7e75fb7f2f52784f39e136dab390601068279996f95a4f128"} Apr 17 21:37:14.009212 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.009191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6" 
event={"ID":"05cac4af-b010-4bb6-ba12-8d55e0fedc36","Type":"ContainerStarted","Data":"9a93bb5b94b19e2a6a3e8a48db2f6d3a8ff60c52bba1c203cb570d80aa4f7f5d"} Apr 17 21:37:14.010552 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.010519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jlszn" event={"ID":"9a699952-32f8-4727-bb72-f047e0297d2f","Type":"ContainerStarted","Data":"e672c0f94ae2536737a29ae3d42326379e0162bfca3eb58e97bc97653679db89"} Apr 17 21:37:14.010704 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.010655 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jlszn" Apr 17 21:37:14.012171 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.012151 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/0.log" Apr 17 21:37:14.012257 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.012192 2576 generic.go:358] "Generic (PLEG): container finished" podID="152480c2-ecf4-4eab-a6b2-3f71ca86e6c0" containerID="95fbd56dc73c0d0953d770dbb3c637b216dd530c7e68aac706d2403b24050d99" exitCode=255 Apr 17 21:37:14.012308 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.012256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" event={"ID":"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0","Type":"ContainerDied","Data":"95fbd56dc73c0d0953d770dbb3c637b216dd530c7e68aac706d2403b24050d99"} Apr 17 21:37:14.012503 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.012467 2576 scope.go:117] "RemoveContainer" containerID="95fbd56dc73c0d0953d770dbb3c637b216dd530c7e68aac706d2403b24050d99" Apr 17 21:37:14.013734 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.013565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" event={"ID":"ee63485a-d5c2-456b-998c-2c74fec67448","Type":"ContainerStarted","Data":"fb9f7723263016cac7ad2c7c80ebe23c7cf21d59d042d5c4cb5cf6f98bff28b1"} Apr 17 21:37:14.015006 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.014983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" event={"ID":"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2","Type":"ContainerStarted","Data":"1eeea76aa779a720380e91cbe02c73bb287ceb595e4b5eafcad9b41766f8b654"} Apr 17 21:37:14.018547 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.018500 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-66bfb9c7dc-xmfmf" podStartSLOduration=34.172102124 podStartE2EDuration="45.018485088s" podCreationTimestamp="2026-04-17 21:36:29 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.333367851 +0000 UTC m=+36.167255760" lastFinishedPulling="2026-04-17 21:37:13.179750818 +0000 UTC m=+47.013638724" observedRunningTime="2026-04-17 21:37:14.016642601 +0000 UTC m=+47.850530531" watchObservedRunningTime="2026-04-17 21:37:14.018485088 +0000 UTC m=+47.852373017" Apr 17 21:37:14.035353 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.035313 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lqwtr" podStartSLOduration=16.475322026 podStartE2EDuration="27.035299428s" podCreationTimestamp="2026-04-17 21:36:47 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.327595541 +0000 UTC m=+36.161483454" lastFinishedPulling="2026-04-17 21:37:12.887572939 +0000 UTC m=+46.721460856" observedRunningTime="2026-04-17 21:37:14.0347972 +0000 UTC m=+47.868685130" watchObservedRunningTime="2026-04-17 21:37:14.035299428 +0000 UTC m=+47.869187356" Apr 17 
21:37:14.052144 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.052097 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" podStartSLOduration=13.219727323 podStartE2EDuration="24.052065832s" podCreationTimestamp="2026-04-17 21:36:50 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.340467677 +0000 UTC m=+36.174355583" lastFinishedPulling="2026-04-17 21:37:13.172806179 +0000 UTC m=+47.006694092" observedRunningTime="2026-04-17 21:37:14.051280231 +0000 UTC m=+47.885168160" watchObservedRunningTime="2026-04-17 21:37:14.052065832 +0000 UTC m=+47.885953764"
Apr 17 21:37:14.068548 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.068495 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-kgrg6" podStartSLOduration=12.241642022 podStartE2EDuration="23.068479632s" podCreationTimestamp="2026-04-17 21:36:51 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.346123103 +0000 UTC m=+36.180011022" lastFinishedPulling="2026-04-17 21:37:13.172960722 +0000 UTC m=+47.006848632" observedRunningTime="2026-04-17 21:37:14.068283232 +0000 UTC m=+47.902171162" watchObservedRunningTime="2026-04-17 21:37:14.068479632 +0000 UTC m=+47.902367562"
Apr 17 21:37:14.089603 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.089556 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-625wr" podStartSLOduration=17.254756365 podStartE2EDuration="28.089538494s" podCreationTimestamp="2026-04-17 21:36:46 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.337247453 +0000 UTC m=+36.171135358" lastFinishedPulling="2026-04-17 21:37:13.172029567 +0000 UTC m=+47.005917487" observedRunningTime="2026-04-17 21:37:14.086954056 +0000 UTC m=+47.920841986" watchObservedRunningTime="2026-04-17 21:37:14.089538494 +0000 UTC m=+47.923426419"
Apr 17 21:37:14.141145 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.139128 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" podStartSLOduration=15.293705038 podStartE2EDuration="26.13910735s" podCreationTimestamp="2026-04-17 21:36:48 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.325326987 +0000 UTC m=+36.159214895" lastFinishedPulling="2026-04-17 21:37:13.170729289 +0000 UTC m=+47.004617207" observedRunningTime="2026-04-17 21:37:14.108954881 +0000 UTC m=+47.942842810" watchObservedRunningTime="2026-04-17 21:37:14.13910735 +0000 UTC m=+47.972995279"
Apr 17 21:37:14.141145 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.139478 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jlszn" podStartSLOduration=37.318859142 podStartE2EDuration="48.139470252s" podCreationTimestamp="2026-04-17 21:36:26 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.353767599 +0000 UTC m=+36.187655515" lastFinishedPulling="2026-04-17 21:37:13.174378704 +0000 UTC m=+47.008266625" observedRunningTime="2026-04-17 21:37:14.134163559 +0000 UTC m=+47.968051488" watchObservedRunningTime="2026-04-17 21:37:14.139470252 +0000 UTC m=+47.973358182"
Apr 17 21:37:14.916452 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.916416 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs"]
Apr 17 21:37:14.939139 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.938326 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs"]
Apr 17 21:37:14.939139 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.938479 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs"
Apr 17 21:37:14.941653 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.941232 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 17 21:37:14.942714 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.942350 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-jrfwd\""
Apr 17 21:37:14.942714 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.942543 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 17 21:37:14.988044 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:14.987967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gc5t\" (UniqueName: \"kubernetes.io/projected/2f23fadd-370c-4137-9a4f-d3ee5b830a79-kube-api-access-8gc5t\") pod \"migrator-74bb7799d9-wl4bs\" (UID: \"2f23fadd-370c-4137-9a4f-d3ee5b830a79\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs"
Apr 17 21:37:15.022189 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.022155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 21:37:15.022621 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.022602 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/0.log"
Apr 17 21:37:15.022672 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.022643 2576 generic.go:358] "Generic (PLEG): container finished" podID="152480c2-ecf4-4eab-a6b2-3f71ca86e6c0" containerID="b4e4d622dbf2906c3631960c6f00927af116ac365265b84c770499095d70d1ce" exitCode=255
Apr 17 21:37:15.022941 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.022912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" event={"ID":"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0","Type":"ContainerDied","Data":"b4e4d622dbf2906c3631960c6f00927af116ac365265b84c770499095d70d1ce"}
Apr 17 21:37:15.022976 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.022964 2576 scope.go:117] "RemoveContainer" containerID="95fbd56dc73c0d0953d770dbb3c637b216dd530c7e68aac706d2403b24050d99"
Apr 17 21:37:15.023800 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.023707 2576 scope.go:117] "RemoveContainer" containerID="b4e4d622dbf2906c3631960c6f00927af116ac365265b84c770499095d70d1ce"
Apr 17 21:37:15.023945 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.023879 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-h57tx_openshift-console-operator(152480c2-ecf4-4eab-a6b2-3f71ca86e6c0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" podUID="152480c2-ecf4-4eab-a6b2-3f71ca86e6c0"
Apr 17 21:37:15.090052 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.089642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gc5t\" (UniqueName: \"kubernetes.io/projected/2f23fadd-370c-4137-9a4f-d3ee5b830a79-kube-api-access-8gc5t\") pod \"migrator-74bb7799d9-wl4bs\" (UID: \"2f23fadd-370c-4137-9a4f-d3ee5b830a79\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs"
Apr 17 21:37:15.102950 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.102916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gc5t\" (UniqueName: \"kubernetes.io/projected/2f23fadd-370c-4137-9a4f-d3ee5b830a79-kube-api-access-8gc5t\") pod \"migrator-74bb7799d9-wl4bs\" (UID: \"2f23fadd-370c-4137-9a4f-d3ee5b830a79\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs"
Apr 17 21:37:15.262139 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.262033 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs"
Apr 17 21:37:15.393722 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.393692 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs"]
Apr 17 21:37:15.396333 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:37:15.396303 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f23fadd_370c_4137_9a4f_d3ee5b830a79.slice/crio-7af49aa3eb1e3ac24188e77e1b70283ca817d181b3c63b56aa3a1ce46bacdd9d WatchSource:0}: Error finding container 7af49aa3eb1e3ac24188e77e1b70283ca817d181b3c63b56aa3a1ce46bacdd9d: Status 404 returned error can't find the container with id 7af49aa3eb1e3ac24188e77e1b70283ca817d181b3c63b56aa3a1ce46bacdd9d
Apr 17 21:37:15.796488 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.796445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:37:15.799454 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.799397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f3915b71-644a-48b2-a22a-a629db33eec4-original-pull-secret\") pod \"global-pull-secret-syncer-rkkmx\" (UID: \"f3915b71-644a-48b2-a22a-a629db33eec4\") " pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:37:15.814642 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.814616 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gjnvl_3d6a476e-116f-4845-8146-b9bc2af9d504/dns-node-resolver/0.log"
Apr 17 21:37:15.852869 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.852833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rkkmx"
Apr 17 21:37:15.897745 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.897704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm"
Apr 17 21:37:15.897916 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.897885 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 21:37:15.897989 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.897961 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls podName:3bab4632-8085-482e-acfd-ff3769ca3407 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:31.897939733 +0000 UTC m=+65.731827643 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-2blqm" (UID: "3bab4632-8085-482e-acfd-ff3769ca3407") : secret "samples-operator-tls" not found
Apr 17 21:37:15.998224 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.998181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:37:15.998416 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.998243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:15.998416 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.998293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:37:15.998416 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.998341 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 21:37:15.998416 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:15.998376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:15.998416 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.998382 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 21:37:15.998657 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.998426 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls podName:538ad283-f697-4eb8-b901-99763f2b9340 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:31.998403701 +0000 UTC m=+65.832291621 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vmb5v" (UID: "538ad283-f697-4eb8-b901-99763f2b9340") : secret "cluster-monitoring-operator-tls" not found
Apr 17 21:37:15.998657 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.998477 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 21:37:15.998657 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.998485 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:31.998466744 +0000 UTC m=+65.832354655 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : configmap references non-existent config key: service-ca.crt
Apr 17 21:37:15.998657 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.998490 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5477664494-rrxhm: secret "image-registry-tls" not found
Apr 17 21:37:15.998657 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.998531 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls podName:51814314-02b7-436f-83f8-4acfbdf378b7 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:31.998518005 +0000 UTC m=+65.832405914 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls") pod "image-registry-5477664494-rrxhm" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7") : secret "image-registry-tls" not found
Apr 17 21:37:15.998657 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:15.998548 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:31.998538332 +0000 UTC m=+65.832426246 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : secret "router-metrics-certs-default" not found
Apr 17 21:37:16.028515 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.028460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs" event={"ID":"2f23fadd-370c-4137-9a4f-d3ee5b830a79","Type":"ContainerStarted","Data":"7af49aa3eb1e3ac24188e77e1b70283ca817d181b3c63b56aa3a1ce46bacdd9d"}
Apr 17 21:37:16.030123 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.030098 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 21:37:16.030540 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.030516 2576 scope.go:117] "RemoveContainer" containerID="b4e4d622dbf2906c3631960c6f00927af116ac365265b84c770499095d70d1ce"
Apr 17 21:37:16.030781 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:16.030758 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-h57tx_openshift-console-operator(152480c2-ecf4-4eab-a6b2-3f71ca86e6c0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" podUID="152480c2-ecf4-4eab-a6b2-3f71ca86e6c0"
Apr 17 21:37:16.200797 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.200738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5"
Apr 17 21:37:16.200962 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.200844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:16.200962 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:16.200866 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 21:37:16.200962 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:16.200948 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert podName:6a2f302e-4951-4581-a1e6-f71a43573912 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:32.200926977 +0000 UTC m=+66.034814903 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert") pod "ingress-canary-8wmk5" (UID: "6a2f302e-4951-4581-a1e6-f71a43573912") : secret "canary-serving-cert" not found
Apr 17 21:37:16.201237 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:16.201005 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 21:37:16.201237 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:16.201059 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls podName:da349b59-9fbe-4add-9ba3-8270d5731310 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:32.201043495 +0000 UTC m=+66.034931426 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls") pod "dns-default-lcgfs" (UID: "da349b59-9fbe-4add-9ba3-8270d5731310") : secret "dns-default-metrics-tls" not found
Apr 17 21:37:16.235733 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.235696 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rkkmx"]
Apr 17 21:37:16.239609 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:37:16.239580 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3915b71_644a_48b2_a22a_a629db33eec4.slice/crio-9083e5df2558fc476016bc657c5077101a94cf55ddb315365c925dbfac6451a1 WatchSource:0}: Error finding container 9083e5df2558fc476016bc657c5077101a94cf55ddb315365c925dbfac6451a1: Status 404 returned error can't find the container with id 9083e5df2558fc476016bc657c5077101a94cf55ddb315365c925dbfac6451a1
Apr 17 21:37:16.623650 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.623624 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4jhjv_02310a72-3db5-42e8-b257-0ccf87bb8deb/node-ca/0.log"
Apr 17 21:37:16.873361 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.873309 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-stw9p"]
Apr 17 21:37:16.894744 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.894699 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-stw9p"]
Apr 17 21:37:16.894967 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.894843 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:16.897563 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.897535 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 21:37:16.897827 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.897808 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-z6th7\""
Apr 17 21:37:16.898057 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:16.898035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 21:37:17.007806 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.007770 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4a70ef53-ac44-46a4-aa6d-728b21d6154e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.008137 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.007962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.008137 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.008029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4a70ef53-ac44-46a4-aa6d-728b21d6154e-crio-socket\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.008137 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.008059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rcrj\" (UniqueName: \"kubernetes.io/projected/4a70ef53-ac44-46a4-aa6d-728b21d6154e-kube-api-access-9rcrj\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.008343 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.008138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4a70ef53-ac44-46a4-aa6d-728b21d6154e-data-volume\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.036467 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.036424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rkkmx" event={"ID":"f3915b71-644a-48b2-a22a-a629db33eec4","Type":"ContainerStarted","Data":"9083e5df2558fc476016bc657c5077101a94cf55ddb315365c925dbfac6451a1"}
Apr 17 21:37:17.038649 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.038616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" event={"ID":"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0","Type":"ContainerStarted","Data":"a3a04de7738a7cbc037e9304da23a4b289a2f437f74c7bf349101961222a5856"}
Apr 17 21:37:17.038778 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.038658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" event={"ID":"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0","Type":"ContainerStarted","Data":"fad6d6d42978d00bc319eaa7d5b44f7d2c07bd1eb508a3e8ff699a75300d2555"}
Apr 17 21:37:17.057750 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.057614 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" podStartSLOduration=33.845082071 podStartE2EDuration="48.057598016s" podCreationTimestamp="2026-04-17 21:36:29 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.323945646 +0000 UTC m=+36.157833552" lastFinishedPulling="2026-04-17 21:37:16.536461586 +0000 UTC m=+50.370349497" observedRunningTime="2026-04-17 21:37:17.057037789 +0000 UTC m=+50.890925721" watchObservedRunningTime="2026-04-17 21:37:17.057598016 +0000 UTC m=+50.891485943"
Apr 17 21:37:17.110007 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.109445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4a70ef53-ac44-46a4-aa6d-728b21d6154e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.110007 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.109603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.110007 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.109671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4a70ef53-ac44-46a4-aa6d-728b21d6154e-crio-socket\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.110007 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.109699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rcrj\" (UniqueName: \"kubernetes.io/projected/4a70ef53-ac44-46a4-aa6d-728b21d6154e-kube-api-access-9rcrj\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.110007 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.109779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4a70ef53-ac44-46a4-aa6d-728b21d6154e-data-volume\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.110007 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:17.109797 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 17 21:37:17.110007 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:17.109853 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls podName:4a70ef53-ac44-46a4-aa6d-728b21d6154e nodeName:}" failed. No retries permitted until 2026-04-17 21:37:17.609834476 +0000 UTC m=+51.443722386 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-stw9p" (UID: "4a70ef53-ac44-46a4-aa6d-728b21d6154e") : secret "insights-runtime-extractor-tls" not found
Apr 17 21:37:17.110007 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.110011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4a70ef53-ac44-46a4-aa6d-728b21d6154e-crio-socket\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.110482 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.110025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4a70ef53-ac44-46a4-aa6d-728b21d6154e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.110482 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.110254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4a70ef53-ac44-46a4-aa6d-728b21d6154e-data-volume\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.122218 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.122186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rcrj\" (UniqueName: \"kubernetes.io/projected/4a70ef53-ac44-46a4-aa6d-728b21d6154e-kube-api-access-9rcrj\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.615119 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:17.615068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:17.615372 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:17.615197 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 17 21:37:17.615372 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:17.615276 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls podName:4a70ef53-ac44-46a4-aa6d-728b21d6154e nodeName:}" failed. No retries permitted until 2026-04-17 21:37:18.615260718 +0000 UTC m=+52.449148628 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-stw9p" (UID: "4a70ef53-ac44-46a4-aa6d-728b21d6154e") : secret "insights-runtime-extractor-tls" not found
Apr 17 21:37:18.626033 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:18.626003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p"
Apr 17 21:37:18.626652 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:18.626221 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 17 21:37:18.626652 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:18.626294 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls podName:4a70ef53-ac44-46a4-aa6d-728b21d6154e nodeName:}" failed. No retries permitted until 2026-04-17 21:37:20.626273559 +0000 UTC m=+54.460161479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-stw9p" (UID: "4a70ef53-ac44-46a4-aa6d-728b21d6154e") : secret "insights-runtime-extractor-tls" not found
Apr 17 21:37:19.049761 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:19.049719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn" event={"ID":"03daabdf-1d60-48ca-aaef-1d5e74ad468c","Type":"ContainerStarted","Data":"768b6e4f59f8300c1d683b95e93b04bb0cd7a6172117eb0016be25e646fa0c10"}
Apr 17 21:37:19.050245 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:19.050218 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"
Apr 17 21:37:19.051584 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:19.051556 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs" event={"ID":"2f23fadd-370c-4137-9a4f-d3ee5b830a79","Type":"ContainerStarted","Data":"f6e4b5f020eeabd1926bb6d165758987f1c89616ceae209a32ccb2fe4c12d783"}
Apr 17 21:37:19.051690 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:19.051591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs" event={"ID":"2f23fadd-370c-4137-9a4f-d3ee5b830a79","Type":"ContainerStarted","Data":"14f836f2dad750272e7d726f3e17e04fa34ac1c63eced45cf033ed8aa87b2fcd"}
Apr 17 21:37:19.052349 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:19.052324 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn"
Apr 17 21:37:19.066607 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:19.066554 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5466ff547-b9gcn" podStartSLOduration=33.937403112 podStartE2EDuration="50.066540119s" podCreationTimestamp="2026-04-17 21:36:29 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.335755132 +0000 UTC m=+36.169643040" lastFinishedPulling="2026-04-17 21:37:18.464892127 +0000 UTC m=+52.298780047" observedRunningTime="2026-04-17 21:37:19.066123582 +0000 UTC m=+52.900011510" watchObservedRunningTime="2026-04-17 21:37:19.066540119 +0000 UTC m=+52.900428048"
Apr 17 21:37:19.080713 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:19.080665 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wl4bs" podStartSLOduration=2.023708173 podStartE2EDuration="5.080651949s" podCreationTimestamp="2026-04-17 21:37:14 +0000 UTC" firstStartedPulling="2026-04-17 21:37:15.398567936 +0000 UTC m=+49.232455842" lastFinishedPulling="2026-04-17 21:37:18.455511703 +0000 UTC m=+52.289399618" observedRunningTime="2026-04-17 21:37:19.080132177 +0000 UTC m=+52.914020106" watchObservedRunningTime="2026-04-17 21:37:19.080651949 +0000 UTC m=+52.914539878"
Apr 17 21:37:20.374574 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:20.374536 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:20.374574 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:20.374579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx"
Apr 17 21:37:20.375069 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:20.375033 2576 scope.go:117] "RemoveContainer" containerID="b4e4d622dbf2906c3631960c6f00927af116ac365265b84c770499095d70d1ce"
Apr 17 21:37:20.375264 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:20.375245 2576
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-h57tx_openshift-console-operator(152480c2-ecf4-4eab-a6b2-3f71ca86e6c0)\"" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" podUID="152480c2-ecf4-4eab-a6b2-3f71ca86e6c0" Apr 17 21:37:20.645694 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:20.645603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p" Apr 17 21:37:20.645840 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:20.645780 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 21:37:20.645903 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:20.645854 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls podName:4a70ef53-ac44-46a4-aa6d-728b21d6154e nodeName:}" failed. No retries permitted until 2026-04-17 21:37:24.645832556 +0000 UTC m=+58.479720467 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-stw9p" (UID: "4a70ef53-ac44-46a4-aa6d-728b21d6154e") : secret "insights-runtime-extractor-tls" not found Apr 17 21:37:21.059299 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:21.059259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rkkmx" event={"ID":"f3915b71-644a-48b2-a22a-a629db33eec4","Type":"ContainerStarted","Data":"39ce3e463e2480b491427ad1a931c65306ffb344628cd2f0d038a613297438db"} Apr 17 21:37:21.074736 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:21.074676 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rkkmx" podStartSLOduration=33.439219117 podStartE2EDuration="38.074654359s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:37:16.24161214 +0000 UTC m=+50.075500056" lastFinishedPulling="2026-04-17 21:37:20.877047388 +0000 UTC m=+54.710935298" observedRunningTime="2026-04-17 21:37:21.073413334 +0000 UTC m=+54.907301275" watchObservedRunningTime="2026-04-17 21:37:21.074654359 +0000 UTC m=+54.908542292" Apr 17 21:37:24.681130 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:24.681071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p" Apr 17 21:37:24.681613 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:24.681237 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 21:37:24.681613 
ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:24.681307 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls podName:4a70ef53-ac44-46a4-aa6d-728b21d6154e nodeName:}" failed. No retries permitted until 2026-04-17 21:37:32.681289683 +0000 UTC m=+66.515177589 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls") pod "insights-runtime-extractor-stw9p" (UID: "4a70ef53-ac44-46a4-aa6d-728b21d6154e") : secret "insights-runtime-extractor-tls" not found Apr 17 21:37:25.915796 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:25.915767 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bhdzs" Apr 17 21:37:31.945365 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:31.945323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" Apr 17 21:37:31.947854 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:31.947829 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bab4632-8085-482e-acfd-ff3769ca3407-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-2blqm\" (UID: \"3bab4632-8085-482e-acfd-ff3769ca3407\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" Apr 17 21:37:32.045876 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.045842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"image-registry-5477664494-rrxhm\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:32.046038 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.045905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" Apr 17 21:37:32.046038 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.045936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:32.046038 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.045968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:32.046207 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:32.046065 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle podName:26507fb0-1d13-41d7-b18c-dfbaf034b8e0 nodeName:}" failed. No retries permitted until 2026-04-17 21:38:04.046052065 +0000 UTC m=+97.879939971 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle") pod "router-default-7646cfc968-5h87d" (UID: "26507fb0-1d13-41d7-b18c-dfbaf034b8e0") : configmap references non-existent config key: service-ca.crt Apr 17 21:37:32.046207 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:32.046064 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 21:37:32.046207 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:32.046156 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls podName:538ad283-f697-4eb8-b901-99763f2b9340 nodeName:}" failed. No retries permitted until 2026-04-17 21:38:04.046139417 +0000 UTC m=+97.880027325 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vmb5v" (UID: "538ad283-f697-4eb8-b901-99763f2b9340") : secret "cluster-monitoring-operator-tls" not found Apr 17 21:37:32.048429 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.048401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-metrics-certs\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d" Apr 17 21:37:32.048535 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.048401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"image-registry-5477664494-rrxhm\" (UID: 
\"51814314-02b7-436f-83f8-4acfbdf378b7\") " pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:32.185604 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.185572 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-lx55l\"" Apr 17 21:37:32.193024 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.193003 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" Apr 17 21:37:32.203558 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.203520 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lmbh2\"" Apr 17 21:37:32.211670 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.211648 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:32.248831 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.248793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5" Apr 17 21:37:32.248980 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.248898 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs" Apr 17 21:37:32.251582 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.251546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/6a2f302e-4951-4581-a1e6-f71a43573912-cert\") pod \"ingress-canary-8wmk5\" (UID: \"6a2f302e-4951-4581-a1e6-f71a43573912\") " pod="openshift-ingress-canary/ingress-canary-8wmk5" Apr 17 21:37:32.253187 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.252414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/da349b59-9fbe-4add-9ba3-8270d5731310-metrics-tls\") pod \"dns-default-lcgfs\" (UID: \"da349b59-9fbe-4add-9ba3-8270d5731310\") " pod="openshift-dns/dns-default-lcgfs" Apr 17 21:37:32.323196 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.323162 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm"] Apr 17 21:37:32.344956 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.344921 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5477664494-rrxhm"] Apr 17 21:37:32.348308 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:37:32.348284 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51814314_02b7_436f_83f8_4acfbdf378b7.slice/crio-387ca3133ac6abe3361d49860fbef8689573b5f046dcccf0673b983e8b9d6675 WatchSource:0}: Error finding container 387ca3133ac6abe3361d49860fbef8689573b5f046dcccf0673b983e8b9d6675: Status 404 returned error can't find the container with id 387ca3133ac6abe3361d49860fbef8689573b5f046dcccf0673b983e8b9d6675 Apr 17 21:37:32.404615 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.404591 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m9tzd\"" Apr 17 21:37:32.412793 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.412769 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8wmk5" Apr 17 21:37:32.415826 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.415624 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wjbk2\"" Apr 17 21:37:32.424230 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.424204 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lcgfs" Apr 17 21:37:32.450273 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.450237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:37:32.453608 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.453534 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 21:37:32.461703 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:32.461052 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 21:37:32.461703 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:37:32.461172 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs podName:697e918d-013d-41df-9440-059bd3d99a19 nodeName:}" failed. No retries permitted until 2026-04-17 21:38:36.461146009 +0000 UTC m=+130.295033930 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs") pod "network-metrics-daemon-lncjj" (UID: "697e918d-013d-41df-9440-059bd3d99a19") : secret "metrics-daemon-secret" not found Apr 17 21:37:32.550264 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.550211 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8wmk5"] Apr 17 21:37:32.552333 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:37:32.552304 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a2f302e_4951_4581_a1e6_f71a43573912.slice/crio-a405bc0fb29872d1b50b02270ad5bbee7435a1831fa5499917d1284d54d60db3 WatchSource:0}: Error finding container a405bc0fb29872d1b50b02270ad5bbee7435a1831fa5499917d1284d54d60db3: Status 404 returned error can't find the container with id a405bc0fb29872d1b50b02270ad5bbee7435a1831fa5499917d1284d54d60db3 Apr 17 21:37:32.565822 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.565787 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lcgfs"] Apr 17 21:37:32.568614 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:37:32.568588 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda349b59_9fbe_4add_9ba3_8270d5731310.slice/crio-9e89da54bccb5f6754a8ffe5722f637c174f1a8f9453b363b3a54d840d92b71f WatchSource:0}: Error finding container 9e89da54bccb5f6754a8ffe5722f637c174f1a8f9453b363b3a54d840d92b71f: Status 404 returned error can't find the container with id 9e89da54bccb5f6754a8ffe5722f637c174f1a8f9453b363b3a54d840d92b71f Apr 17 21:37:32.740904 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.740831 2576 scope.go:117] "RemoveContainer" containerID="b4e4d622dbf2906c3631960c6f00927af116ac365265b84c770499095d70d1ce" Apr 17 21:37:32.752723 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:37:32.752698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p" Apr 17 21:37:32.755182 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.755151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4a70ef53-ac44-46a4-aa6d-728b21d6154e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-stw9p\" (UID: \"4a70ef53-ac44-46a4-aa6d-728b21d6154e\") " pod="openshift-insights/insights-runtime-extractor-stw9p" Apr 17 21:37:32.809714 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.809689 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-z6th7\"" Apr 17 21:37:32.817328 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.817305 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-stw9p" Apr 17 21:37:32.950804 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:32.950759 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-stw9p"] Apr 17 21:37:33.095243 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.095190 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8wmk5" event={"ID":"6a2f302e-4951-4581-a1e6-f71a43573912","Type":"ContainerStarted","Data":"a405bc0fb29872d1b50b02270ad5bbee7435a1831fa5499917d1284d54d60db3"} Apr 17 21:37:33.098791 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.098688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5477664494-rrxhm" event={"ID":"51814314-02b7-436f-83f8-4acfbdf378b7","Type":"ContainerStarted","Data":"2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229"} Apr 17 21:37:33.098791 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.098729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5477664494-rrxhm" event={"ID":"51814314-02b7-436f-83f8-4acfbdf378b7","Type":"ContainerStarted","Data":"387ca3133ac6abe3361d49860fbef8689573b5f046dcccf0673b983e8b9d6675"} Apr 17 21:37:33.099188 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.099150 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5477664494-rrxhm" Apr 17 21:37:33.101261 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.101232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" event={"ID":"3bab4632-8085-482e-acfd-ff3769ca3407","Type":"ContainerStarted","Data":"6458160fef362dc0d21ac921a5861a8008d50cf756dfdc43f751fda3e85e7e48"} Apr 17 21:37:33.106049 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.105164 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log" Apr 17 21:37:33.106049 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.105243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" event={"ID":"152480c2-ecf4-4eab-a6b2-3f71ca86e6c0","Type":"ContainerStarted","Data":"ce71a6934857623c55b0f265b1c3f1e05990ebb191e454068bb22d87f89811f7"} Apr 17 21:37:33.106049 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.105981 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" Apr 17 21:37:33.109866 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.109840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lcgfs" event={"ID":"da349b59-9fbe-4add-9ba3-8270d5731310","Type":"ContainerStarted","Data":"9e89da54bccb5f6754a8ffe5722f637c174f1a8f9453b363b3a54d840d92b71f"} Apr 17 21:37:33.112888 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.112827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-stw9p" event={"ID":"4a70ef53-ac44-46a4-aa6d-728b21d6154e","Type":"ContainerStarted","Data":"19d8528ee073872772150f51a93a8fb613c4a3a4d4c6358245a37b8d6a38344e"} Apr 17 21:37:33.112888 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.112859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-stw9p" event={"ID":"4a70ef53-ac44-46a4-aa6d-728b21d6154e","Type":"ContainerStarted","Data":"09e452136cf4855346814aa778ce8cde3063f9c0c72960d2dfd0ae32dfc6a72e"} Apr 17 21:37:33.119934 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.119582 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5477664494-rrxhm" 
podStartSLOduration=66.119564594 podStartE2EDuration="1m6.119564594s" podCreationTimestamp="2026-04-17 21:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:37:33.119284754 +0000 UTC m=+66.953172684" watchObservedRunningTime="2026-04-17 21:37:33.119564594 +0000 UTC m=+66.953452524" Apr 17 21:37:33.138034 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.137497 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" podStartSLOduration=36.319511315 podStartE2EDuration="47.137478209s" podCreationTimestamp="2026-04-17 21:36:46 +0000 UTC" firstStartedPulling="2026-04-17 21:37:02.354493402 +0000 UTC m=+36.188381308" lastFinishedPulling="2026-04-17 21:37:13.172460296 +0000 UTC m=+47.006348202" observedRunningTime="2026-04-17 21:37:33.137187019 +0000 UTC m=+66.971074958" watchObservedRunningTime="2026-04-17 21:37:33.137478209 +0000 UTC m=+66.971366140" Apr 17 21:37:33.899307 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:33.899278 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-h57tx" Apr 17 21:37:36.125656 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.125616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8wmk5" event={"ID":"6a2f302e-4951-4581-a1e6-f71a43573912","Type":"ContainerStarted","Data":"49ec9483d3f1841c26ac12b734d8b7af2e4f744595ca4a356b64c8658cc2aa79"} Apr 17 21:37:36.127438 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.127405 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" event={"ID":"3bab4632-8085-482e-acfd-ff3769ca3407","Type":"ContainerStarted","Data":"adfa58ee0e6e4d6d7d1a6a23a282188174b7c46c200ee55b19d74137d680a9a1"} Apr 17 
21:37:36.127589 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.127446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" event={"ID":"3bab4632-8085-482e-acfd-ff3769ca3407","Type":"ContainerStarted","Data":"9414f4d669d72d7235f1b272798b5898be1bb2f3d6dfbbd577e0391abb4602af"}
Apr 17 21:37:36.129265 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.129207 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lcgfs" event={"ID":"da349b59-9fbe-4add-9ba3-8270d5731310","Type":"ContainerStarted","Data":"6712c588d5703e7142b924921313f3663c039d984880351955b0bb53ff61eb8a"}
Apr 17 21:37:36.129265 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.129243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lcgfs" event={"ID":"da349b59-9fbe-4add-9ba3-8270d5731310","Type":"ContainerStarted","Data":"025210dae1f8dda7caf3425dafd4e7d4ce48ddbb75327fd63488f3e5ef9c697a"}
Apr 17 21:37:36.129409 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.129326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:36.130904 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.130883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-stw9p" event={"ID":"4a70ef53-ac44-46a4-aa6d-728b21d6154e","Type":"ContainerStarted","Data":"b999e317c789afc9b648c51acd03d86f2ad8a7ec0206393fdefc86bed6bc554e"}
Apr 17 21:37:36.140258 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.140206 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8wmk5" podStartSLOduration=33.331897799 podStartE2EDuration="36.140194853s" podCreationTimestamp="2026-04-17 21:37:00 +0000 UTC" firstStartedPulling="2026-04-17 21:37:32.554461887 +0000 UTC m=+66.388349792" lastFinishedPulling="2026-04-17 21:37:35.362758933 +0000 UTC m=+69.196646846" observedRunningTime="2026-04-17 21:37:36.138724637 +0000 UTC m=+69.972612566" watchObservedRunningTime="2026-04-17 21:37:36.140194853 +0000 UTC m=+69.974082778"
Apr 17 21:37:36.154342 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.154284 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lcgfs" podStartSLOduration=33.365130926 podStartE2EDuration="36.154268804s" podCreationTimestamp="2026-04-17 21:37:00 +0000 UTC" firstStartedPulling="2026-04-17 21:37:32.570501104 +0000 UTC m=+66.404389010" lastFinishedPulling="2026-04-17 21:37:35.359638965 +0000 UTC m=+69.193526888" observedRunningTime="2026-04-17 21:37:36.153357425 +0000 UTC m=+69.987245358" watchObservedRunningTime="2026-04-17 21:37:36.154268804 +0000 UTC m=+69.988156732"
Apr 17 21:37:36.168059 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:36.167998 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-2blqm" podStartSLOduration=47.197479492 podStartE2EDuration="50.167966815s" podCreationTimestamp="2026-04-17 21:36:46 +0000 UTC" firstStartedPulling="2026-04-17 21:37:32.394016954 +0000 UTC m=+66.227904863" lastFinishedPulling="2026-04-17 21:37:35.364504276 +0000 UTC m=+69.198392186" observedRunningTime="2026-04-17 21:37:36.167005766 +0000 UTC m=+70.000893694" watchObservedRunningTime="2026-04-17 21:37:36.167966815 +0000 UTC m=+70.001854743"
Apr 17 21:37:37.140639 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:37.140597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-stw9p" event={"ID":"4a70ef53-ac44-46a4-aa6d-728b21d6154e","Type":"ContainerStarted","Data":"90e9347b75ff285e549f1b3c5c317e1e2bebe31e8e00dc441ea722c5fe7f9668"}
Apr 17 21:37:37.160177 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:37.160122 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-stw9p" podStartSLOduration=17.505218634 podStartE2EDuration="21.160107367s" podCreationTimestamp="2026-04-17 21:37:16 +0000 UTC" firstStartedPulling="2026-04-17 21:37:33.096301375 +0000 UTC m=+66.930189294" lastFinishedPulling="2026-04-17 21:37:36.751190111 +0000 UTC m=+70.585078027" observedRunningTime="2026-04-17 21:37:37.158388702 +0000 UTC m=+70.992276630" watchObservedRunningTime="2026-04-17 21:37:37.160107367 +0000 UTC m=+70.993995295"
Apr 17 21:37:45.025170 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:45.025137 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jlszn"
Apr 17 21:37:46.142606 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:46.142475 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lcgfs"
Apr 17 21:37:52.216095 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:52.216046 2576 patch_prober.go:28] interesting pod/image-registry-5477664494-rrxhm container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 21:37:52.216513 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:52.216135 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5477664494-rrxhm" podUID="51814314-02b7-436f-83f8-4acfbdf378b7" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 21:37:54.121330 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:54.121122 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:37:59.978332 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:37:59.978295 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5477664494-rrxhm"]
Apr 17 21:38:04.123272 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.123233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:38:04.123272 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.123279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:38:04.123844 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.123825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26507fb0-1d13-41d7-b18c-dfbaf034b8e0-service-ca-bundle\") pod \"router-default-7646cfc968-5h87d\" (UID: \"26507fb0-1d13-41d7-b18c-dfbaf034b8e0\") " pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:38:04.126424 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.126395 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/538ad283-f697-4eb8-b901-99763f2b9340-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vmb5v\" (UID: \"538ad283-f697-4eb8-b901-99763f2b9340\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:38:04.357143 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.357102 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-7cjxn\""
Apr 17 21:38:04.365425 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.365389 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"
Apr 17 21:38:04.374013 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.373804 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-dk2tl\""
Apr 17 21:38:04.381972 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.381905 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:38:04.496242 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.496203 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v"]
Apr 17 21:38:04.499515 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:38:04.499488 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod538ad283_f697_4eb8_b901_99763f2b9340.slice/crio-f5813533f7212bee2fca7f1b9784093d86734800b13747ffc5bf88a47b2cd994 WatchSource:0}: Error finding container f5813533f7212bee2fca7f1b9784093d86734800b13747ffc5bf88a47b2cd994: Status 404 returned error can't find the container with id f5813533f7212bee2fca7f1b9784093d86734800b13747ffc5bf88a47b2cd994
Apr 17 21:38:04.524531 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:04.524502 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7646cfc968-5h87d"]
Apr 17 21:38:04.527870 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:38:04.527843 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26507fb0_1d13_41d7_b18c_dfbaf034b8e0.slice/crio-5a47a305384c539c887d25d23a20f483f29e0b7692407147c6361aafa7609f3e WatchSource:0}: Error finding container 5a47a305384c539c887d25d23a20f483f29e0b7692407147c6361aafa7609f3e: Status 404 returned error can't find the container with id 5a47a305384c539c887d25d23a20f483f29e0b7692407147c6361aafa7609f3e
Apr 17 21:38:05.220695 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:05.220656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7646cfc968-5h87d" event={"ID":"26507fb0-1d13-41d7-b18c-dfbaf034b8e0","Type":"ContainerStarted","Data":"e3624e35dfc0de566036bbc88cbdf2b968685a462ee7bec13384b0eca93ffacf"}
Apr 17 21:38:05.220695 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:05.220692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7646cfc968-5h87d" event={"ID":"26507fb0-1d13-41d7-b18c-dfbaf034b8e0","Type":"ContainerStarted","Data":"5a47a305384c539c887d25d23a20f483f29e0b7692407147c6361aafa7609f3e"}
Apr 17 21:38:05.221837 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:05.221813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" event={"ID":"538ad283-f697-4eb8-b901-99763f2b9340","Type":"ContainerStarted","Data":"f5813533f7212bee2fca7f1b9784093d86734800b13747ffc5bf88a47b2cd994"}
Apr 17 21:38:05.240104 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:05.239466 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7646cfc968-5h87d" podStartSLOduration=78.239447142 podStartE2EDuration="1m18.239447142s" podCreationTimestamp="2026-04-17 21:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:38:05.236977656 +0000 UTC m=+99.070865596" watchObservedRunningTime="2026-04-17 21:38:05.239447142 +0000 UTC m=+99.073335072"
Apr 17 21:38:05.382289 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:05.382248 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:38:05.384701 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:05.384673 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:38:06.225065 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:06.225023 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:38:06.226430 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:06.226405 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7646cfc968-5h87d"
Apr 17 21:38:07.228377 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:07.228338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" event={"ID":"538ad283-f697-4eb8-b901-99763f2b9340","Type":"ContainerStarted","Data":"4b1afeb111134eefc4414e9ba716c32c63fabece0e1219c5a8b0468eadef3635"}
Apr 17 21:38:07.244281 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:07.244236 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vmb5v" podStartSLOduration=78.408966874 podStartE2EDuration="1m20.24422256s" podCreationTimestamp="2026-04-17 21:36:47 +0000 UTC" firstStartedPulling="2026-04-17 21:38:04.501398097 +0000 UTC m=+98.335286003" lastFinishedPulling="2026-04-17 21:38:06.336653768 +0000 UTC m=+100.170541689" observedRunningTime="2026-04-17 21:38:07.243307725 +0000 UTC m=+101.077195658" watchObservedRunningTime="2026-04-17 21:38:07.24422256 +0000 UTC m=+101.078110535"
Apr 17 21:38:15.245363 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.245325 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-j5tkg"]
Apr 17 21:38:15.249000 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.248971 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.251598 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.251561 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 21:38:15.251885 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.251860 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 21:38:15.252802 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.252778 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-n69pg\""
Apr 17 21:38:15.252895 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.252793 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 21:38:15.252895 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.252793 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 21:38:15.315729 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.315688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fxn9\" (UniqueName: \"kubernetes.io/projected/1f172aba-e062-4d68-a684-36ccee45d666-kube-api-access-4fxn9\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.315910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.315746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1f172aba-e062-4d68-a684-36ccee45d666-root\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.315910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.315774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-accelerators-collector-config\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.315910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.315805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-textfile\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.315910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.315820 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-wtmp\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.315910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.315876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f172aba-e062-4d68-a684-36ccee45d666-metrics-client-ca\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.316133 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.315940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-tls\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.316133 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.315968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f172aba-e062-4d68-a684-36ccee45d666-sys\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.316133 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.315984 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417247 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-textfile\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417423 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-wtmp\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417423 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f172aba-e062-4d68-a684-36ccee45d666-metrics-client-ca\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417423 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-tls\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417423 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f172aba-e062-4d68-a684-36ccee45d666-sys\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417423 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fxn9\" (UniqueName: \"kubernetes.io/projected/1f172aba-e062-4d68-a684-36ccee45d666-kube-api-access-4fxn9\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-wtmp\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1f172aba-e062-4d68-a684-36ccee45d666-root\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417680 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:38:15.417497 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 21:38:15.417680 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:38:15.417629 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-tls podName:1f172aba-e062-4d68-a684-36ccee45d666 nodeName:}" failed. No retries permitted until 2026-04-17 21:38:15.917608927 +0000 UTC m=+109.751496839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-tls") pod "node-exporter-j5tkg" (UID: "1f172aba-e062-4d68-a684-36ccee45d666") : secret "node-exporter-tls" not found
Apr 17 21:38:15.417680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-textfile\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.417680 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1f172aba-e062-4d68-a684-36ccee45d666-root\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.418019 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-accelerators-collector-config\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.418019 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1f172aba-e062-4d68-a684-36ccee45d666-sys\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.418019 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.417957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f172aba-e062-4d68-a684-36ccee45d666-metrics-client-ca\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.418294 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.418272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-accelerators-collector-config\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.419784 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.419758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.428329 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.428274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fxn9\" (UniqueName: \"kubernetes.io/projected/1f172aba-e062-4d68-a684-36ccee45d666-kube-api-access-4fxn9\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.922823 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.922778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-tls\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:15.925112 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:15.925089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1f172aba-e062-4d68-a684-36ccee45d666-node-exporter-tls\") pod \"node-exporter-j5tkg\" (UID: \"1f172aba-e062-4d68-a684-36ccee45d666\") " pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:16.158070 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:16.158038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-j5tkg"
Apr 17 21:38:16.168860 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:38:16.168830 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f172aba_e062_4d68_a684_36ccee45d666.slice/crio-dc6387c779246c5cf201e7cf3fba884b006857d9d3229ca9ea2822738eba4aee WatchSource:0}: Error finding container dc6387c779246c5cf201e7cf3fba884b006857d9d3229ca9ea2822738eba4aee: Status 404 returned error can't find the container with id dc6387c779246c5cf201e7cf3fba884b006857d9d3229ca9ea2822738eba4aee
Apr 17 21:38:16.253821 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:16.253723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j5tkg" event={"ID":"1f172aba-e062-4d68-a684-36ccee45d666","Type":"ContainerStarted","Data":"dc6387c779246c5cf201e7cf3fba884b006857d9d3229ca9ea2822738eba4aee"}
Apr 17 21:38:17.258066 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:17.257980 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f172aba-e062-4d68-a684-36ccee45d666" containerID="68ef77c2f41e78b9f59f3be120956653a2df01e975695ff9ba6077885eeeb5fe" exitCode=0
Apr 17 21:38:17.258066 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:17.258060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j5tkg" event={"ID":"1f172aba-e062-4d68-a684-36ccee45d666","Type":"ContainerDied","Data":"68ef77c2f41e78b9f59f3be120956653a2df01e975695ff9ba6077885eeeb5fe"}
Apr 17 21:38:18.263285 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:18.263248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j5tkg" event={"ID":"1f172aba-e062-4d68-a684-36ccee45d666","Type":"ContainerStarted","Data":"e075b1cf136b0cb685f8c18883a31fe485de94c296fd72853c1faffcc756b310"}
Apr 17 21:38:18.263285 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:18.263286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j5tkg" event={"ID":"1f172aba-e062-4d68-a684-36ccee45d666","Type":"ContainerStarted","Data":"1808c733df21b45235a0ae60e241047969e44a1c5db243221f75f9c4b2e9d4c7"}
Apr 17 21:38:18.282155 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:18.282071 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-j5tkg" podStartSLOduration=2.5280489 podStartE2EDuration="3.282052608s" podCreationTimestamp="2026-04-17 21:38:15 +0000 UTC" firstStartedPulling="2026-04-17 21:38:16.170754025 +0000 UTC m=+110.004641932" lastFinishedPulling="2026-04-17 21:38:16.92475773 +0000 UTC m=+110.758645640" observedRunningTime="2026-04-17 21:38:18.280855058 +0000 UTC m=+112.114742988" watchObservedRunningTime="2026-04-17 21:38:18.282052608 +0000 UTC m=+112.115940538"
Apr 17 21:38:24.997406 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:24.997337 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5477664494-rrxhm" podUID="51814314-02b7-436f-83f8-4acfbdf378b7" containerName="registry" containerID="cri-o://2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229" gracePeriod=30
Apr 17 21:38:25.242333 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.242303 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:38:25.283793 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.283706 2576 generic.go:358] "Generic (PLEG): container finished" podID="cd17cea4-4ab5-4023-b0a8-ebd4db6056d6" containerID="a90a097220ff8fa7e75fb7f2f52784f39e136dab390601068279996f95a4f128" exitCode=0
Apr 17 21:38:25.283793 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.283778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-625wr" event={"ID":"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6","Type":"ContainerDied","Data":"a90a097220ff8fa7e75fb7f2f52784f39e136dab390601068279996f95a4f128"}
Apr 17 21:38:25.284249 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.284228 2576 scope.go:117] "RemoveContainer" containerID="a90a097220ff8fa7e75fb7f2f52784f39e136dab390601068279996f95a4f128"
Apr 17 21:38:25.285102 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.285062 2576 generic.go:358] "Generic (PLEG): container finished" podID="51814314-02b7-436f-83f8-4acfbdf378b7" containerID="2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229" exitCode=0
Apr 17 21:38:25.285197 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.285111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5477664494-rrxhm" event={"ID":"51814314-02b7-436f-83f8-4acfbdf378b7","Type":"ContainerDied","Data":"2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229"}
Apr 17 21:38:25.285197 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.285142 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5477664494-rrxhm" event={"ID":"51814314-02b7-436f-83f8-4acfbdf378b7","Type":"ContainerDied","Data":"387ca3133ac6abe3361d49860fbef8689573b5f046dcccf0673b983e8b9d6675"}
Apr 17 21:38:25.285197 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.285145 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5477664494-rrxhm"
Apr 17 21:38:25.285197 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.285158 2576 scope.go:117] "RemoveContainer" containerID="2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229"
Apr 17 21:38:25.286604 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.286581 2576 generic.go:358] "Generic (PLEG): container finished" podID="4ee90e9e-74f6-4a93-b52b-9e82c5789ae2" containerID="1eeea76aa779a720380e91cbe02c73bb287ceb595e4b5eafcad9b41766f8b654" exitCode=0
Apr 17 21:38:25.286692 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.286620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" event={"ID":"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2","Type":"ContainerDied","Data":"1eeea76aa779a720380e91cbe02c73bb287ceb595e4b5eafcad9b41766f8b654"}
Apr 17 21:38:25.286910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.286898 2576 scope.go:117] "RemoveContainer" containerID="1eeea76aa779a720380e91cbe02c73bb287ceb595e4b5eafcad9b41766f8b654"
Apr 17 21:38:25.295287 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.295263 2576 scope.go:117] "RemoveContainer" containerID="2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229"
Apr 17 21:38:25.295694 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:38:25.295586 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229\": container with ID starting with 2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229 not found: ID does not exist" containerID="2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229"
Apr 17 21:38:25.295694 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.295620 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229"} err="failed to get container status \"2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229\": rpc error: code = NotFound desc = could not find container \"2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229\": container with ID starting with 2e003e94871a5c954b2f45f7100e99dc5ca1b464848dc346e0e416643ccf9229 not found: ID does not exist"
Apr 17 21:38:25.305956 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.305937 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clms7\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-kube-api-access-clms7\") pod \"51814314-02b7-436f-83f8-4acfbdf378b7\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") "
Apr 17 21:38:25.306070 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.305989 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-registry-certificates\") pod \"51814314-02b7-436f-83f8-4acfbdf378b7\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") "
Apr 17 21:38:25.306070 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.306036 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-bound-sa-token\") pod \"51814314-02b7-436f-83f8-4acfbdf378b7\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") "
Apr 17 21:38:25.306070 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.306088 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51814314-02b7-436f-83f8-4acfbdf378b7-ca-trust-extracted\") pod \"51814314-02b7-436f-83f8-4acfbdf378b7\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") "
Apr 17 21:38:25.306246 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.306126 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-image-registry-private-configuration\") pod \"51814314-02b7-436f-83f8-4acfbdf378b7\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") "
Apr 17 21:38:25.306246 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.306160 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-installation-pull-secrets\") pod \"51814314-02b7-436f-83f8-4acfbdf378b7\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") "
Apr 17 21:38:25.306343 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.306186 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-trusted-ca\") pod \"51814314-02b7-436f-83f8-4acfbdf378b7\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") "
Apr 17 21:38:25.306398 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.306339 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") pod \"51814314-02b7-436f-83f8-4acfbdf378b7\" (UID: \"51814314-02b7-436f-83f8-4acfbdf378b7\") "
Apr 17 21:38:25.306521 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.306371 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "51814314-02b7-436f-83f8-4acfbdf378b7" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 21:38:25.306639 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.306619 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-registry-certificates\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\""
Apr 17 21:38:25.306883 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.306857 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "51814314-02b7-436f-83f8-4acfbdf378b7" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 21:38:25.310248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.310142 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-kube-api-access-clms7" (OuterVolumeSpecName: "kube-api-access-clms7") pod "51814314-02b7-436f-83f8-4acfbdf378b7" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7"). InnerVolumeSpecName "kube-api-access-clms7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:38:25.310248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.310216 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "51814314-02b7-436f-83f8-4acfbdf378b7" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7"). InnerVolumeSpecName "image-registry-private-configuration".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:38:25.310519 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.310475 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "51814314-02b7-436f-83f8-4acfbdf378b7" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:38:25.310519 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.310478 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "51814314-02b7-436f-83f8-4acfbdf378b7" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:38:25.310519 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.310507 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "51814314-02b7-436f-83f8-4acfbdf378b7" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:38:25.316493 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.316467 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51814314-02b7-436f-83f8-4acfbdf378b7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "51814314-02b7-436f-83f8-4acfbdf378b7" (UID: "51814314-02b7-436f-83f8-4acfbdf378b7"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:38:25.407449 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.407377 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-installation-pull-secrets\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\"" Apr 17 21:38:25.407449 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.407413 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51814314-02b7-436f-83f8-4acfbdf378b7-trusted-ca\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\"" Apr 17 21:38:25.407449 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.407429 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-registry-tls\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\"" Apr 17 21:38:25.407449 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.407442 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clms7\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-kube-api-access-clms7\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\"" Apr 17 21:38:25.407449 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.407457 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51814314-02b7-436f-83f8-4acfbdf378b7-bound-sa-token\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\"" Apr 17 21:38:25.407827 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.407471 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51814314-02b7-436f-83f8-4acfbdf378b7-ca-trust-extracted\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\"" Apr 17 21:38:25.407827 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:38:25.407487 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/51814314-02b7-436f-83f8-4acfbdf378b7-image-registry-private-configuration\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\"" Apr 17 21:38:25.606782 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.606749 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5477664494-rrxhm"] Apr 17 21:38:25.612328 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.612298 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5477664494-rrxhm"] Apr 17 21:38:25.724876 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.724839 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7646cfc968-5h87d_26507fb0-1d13-41d7-b18c-dfbaf034b8e0/router/0.log" Apr 17 21:38:25.731538 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:25.731511 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8wmk5_6a2f302e-4951-4581-a1e6-f71a43573912/serve-healthcheck-canary/0.log" Apr 17 21:38:26.292848 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:26.292807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qwfp5" event={"ID":"4ee90e9e-74f6-4a93-b52b-9e82c5789ae2","Type":"ContainerStarted","Data":"d2e0b4a87d3a6dc090476a2c9c339b2c58fd6de78376f375072edd04e244a8f6"} Apr 17 21:38:26.294573 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:26.294540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-625wr" event={"ID":"cd17cea4-4ab5-4023-b0a8-ebd4db6056d6","Type":"ContainerStarted","Data":"1bf1f3cd53dcc9af45aa91692f595df37e0ad546b5bdd85edad3a26fa8bb7745"} Apr 17 21:38:26.745040 
ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:26.745008 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51814314-02b7-436f-83f8-4acfbdf378b7" path="/var/lib/kubelet/pods/51814314-02b7-436f-83f8-4acfbdf378b7/volumes" Apr 17 21:38:30.581449 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:30.581409 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" podUID="1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 21:38:36.505287 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:36.505251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:38:36.507646 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:36.507614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/697e918d-013d-41df-9440-059bd3d99a19-metrics-certs\") pod \"network-metrics-daemon-lncjj\" (UID: \"697e918d-013d-41df-9440-059bd3d99a19\") " pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:38:36.561687 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:36.561658 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nlncx\"" Apr 17 21:38:36.569780 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:36.569744 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lncjj" Apr 17 21:38:36.702552 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:36.702525 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lncjj"] Apr 17 21:38:36.705270 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:38:36.705234 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod697e918d_013d_41df_9440_059bd3d99a19.slice/crio-f3443ac6abeb950fd000447f7e2d561b3d2a88fdc6b4cd8009c6d04372bf327e WatchSource:0}: Error finding container f3443ac6abeb950fd000447f7e2d561b3d2a88fdc6b4cd8009c6d04372bf327e: Status 404 returned error can't find the container with id f3443ac6abeb950fd000447f7e2d561b3d2a88fdc6b4cd8009c6d04372bf327e Apr 17 21:38:37.329002 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:37.328952 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lncjj" event={"ID":"697e918d-013d-41df-9440-059bd3d99a19","Type":"ContainerStarted","Data":"f3443ac6abeb950fd000447f7e2d561b3d2a88fdc6b4cd8009c6d04372bf327e"} Apr 17 21:38:38.333420 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:38.333383 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lncjj" event={"ID":"697e918d-013d-41df-9440-059bd3d99a19","Type":"ContainerStarted","Data":"272df825370060a4bdaa638edeaa00b543b18b4a6035343ca7839de95a424056"} Apr 17 21:38:38.333420 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:38.333426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lncjj" event={"ID":"697e918d-013d-41df-9440-059bd3d99a19","Type":"ContainerStarted","Data":"a21cd6147f480afd582b3de68d96ea8f401c56df32819b2b868855b41a5096fd"} Apr 17 21:38:38.348930 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:38.348876 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-lncjj" podStartSLOduration=131.174901102 podStartE2EDuration="2m12.348859911s" podCreationTimestamp="2026-04-17 21:36:26 +0000 UTC" firstStartedPulling="2026-04-17 21:38:36.707518411 +0000 UTC m=+130.541406321" lastFinishedPulling="2026-04-17 21:38:37.881477216 +0000 UTC m=+131.715365130" observedRunningTime="2026-04-17 21:38:38.347799264 +0000 UTC m=+132.181687215" watchObservedRunningTime="2026-04-17 21:38:38.348859911 +0000 UTC m=+132.182747839" Apr 17 21:38:39.338873 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:39.338713 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee63485a-d5c2-456b-998c-2c74fec67448" containerID="fb9f7723263016cac7ad2c7c80ebe23c7cf21d59d042d5c4cb5cf6f98bff28b1" exitCode=0 Apr 17 21:38:39.340303 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:39.339398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" event={"ID":"ee63485a-d5c2-456b-998c-2c74fec67448","Type":"ContainerDied","Data":"fb9f7723263016cac7ad2c7c80ebe23c7cf21d59d042d5c4cb5cf6f98bff28b1"} Apr 17 21:38:39.340303 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:39.339781 2576 scope.go:117] "RemoveContainer" containerID="fb9f7723263016cac7ad2c7c80ebe23c7cf21d59d042d5c4cb5cf6f98bff28b1" Apr 17 21:38:40.344569 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:40.344538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rn4mk" event={"ID":"ee63485a-d5c2-456b-998c-2c74fec67448","Type":"ContainerStarted","Data":"33b5ec6ca55587e7c831b11f20ce979650c9838169853f563a4c4e766da0220b"} Apr 17 21:38:40.581041 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:40.581001 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" podUID="1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0" containerName="service-proxy" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 21:38:50.581807 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:50.581759 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" podUID="1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 21:38:50.582232 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:50.581849 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" Apr 17 21:38:50.582435 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:50.582416 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"a3a04de7738a7cbc037e9304da23a4b289a2f437f74c7bf349101961222a5856"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 21:38:50.582472 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:50.582458 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" podUID="1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0" containerName="service-proxy" containerID="cri-o://a3a04de7738a7cbc037e9304da23a4b289a2f437f74c7bf349101961222a5856" gracePeriod=30 Apr 17 21:38:51.378882 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:51.378845 2576 generic.go:358] "Generic (PLEG): container finished" podID="1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0" containerID="a3a04de7738a7cbc037e9304da23a4b289a2f437f74c7bf349101961222a5856" exitCode=2 Apr 17 21:38:51.379091 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:51.378912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" event={"ID":"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0","Type":"ContainerDied","Data":"a3a04de7738a7cbc037e9304da23a4b289a2f437f74c7bf349101961222a5856"} Apr 17 21:38:51.379091 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:38:51.378951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57b565c99f-5gdx6" event={"ID":"1c1e78c2-ccae-4f8f-9f03-beb0b18a23f0","Type":"ContainerStarted","Data":"1ce452a77f260b7bdc7d695c9d865b35217193f6dab1ab1f38c2ea33a80b6af4"} Apr 17 21:41:26.683232 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:26.683201 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log" Apr 17 21:41:26.684330 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:26.684310 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log" Apr 17 21:41:26.694919 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:26.694895 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 21:41:46.668594 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.668557 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88"] Apr 17 21:41:46.670945 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.668845 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51814314-02b7-436f-83f8-4acfbdf378b7" containerName="registry" Apr 17 21:41:46.670945 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.668858 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="51814314-02b7-436f-83f8-4acfbdf378b7" containerName="registry" Apr 17 21:41:46.670945 ip-10-0-141-47 kubenswrapper[2576]: I0417 
21:41:46.668933 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="51814314-02b7-436f-83f8-4acfbdf378b7" containerName="registry" Apr 17 21:41:46.671910 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.671889 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.674655 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.674630 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 21:41:46.674800 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.674778 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 21:41:46.674864 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.674805 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-lx4wk\"" Apr 17 21:41:46.674921 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.674910 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 21:41:46.674969 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.674913 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 21:41:46.683301 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.683273 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88"] Apr 17 21:41:46.763663 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.763624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b8ea2aa-bb66-4601-8120-6418a6c3d99b-webhook-cert\") pod 
\"opendatahub-operator-controller-manager-6bfddf7b9f-46h88\" (UID: \"9b8ea2aa-bb66-4601-8120-6418a6c3d99b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.763846 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.763686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b8ea2aa-bb66-4601-8120-6418a6c3d99b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-46h88\" (UID: \"9b8ea2aa-bb66-4601-8120-6418a6c3d99b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.763846 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.763754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrbm\" (UniqueName: \"kubernetes.io/projected/9b8ea2aa-bb66-4601-8120-6418a6c3d99b-kube-api-access-jvrbm\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-46h88\" (UID: \"9b8ea2aa-bb66-4601-8120-6418a6c3d99b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.864432 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.864389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrbm\" (UniqueName: \"kubernetes.io/projected/9b8ea2aa-bb66-4601-8120-6418a6c3d99b-kube-api-access-jvrbm\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-46h88\" (UID: \"9b8ea2aa-bb66-4601-8120-6418a6c3d99b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.864613 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.864444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b8ea2aa-bb66-4601-8120-6418a6c3d99b-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-46h88\" (UID: 
\"9b8ea2aa-bb66-4601-8120-6418a6c3d99b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.864613 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.864470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b8ea2aa-bb66-4601-8120-6418a6c3d99b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-46h88\" (UID: \"9b8ea2aa-bb66-4601-8120-6418a6c3d99b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.866944 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.866918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b8ea2aa-bb66-4601-8120-6418a6c3d99b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-46h88\" (UID: \"9b8ea2aa-bb66-4601-8120-6418a6c3d99b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.867066 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.866929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b8ea2aa-bb66-4601-8120-6418a6c3d99b-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-46h88\" (UID: \"9b8ea2aa-bb66-4601-8120-6418a6c3d99b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.872368 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:46.872342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrbm\" (UniqueName: \"kubernetes.io/projected/9b8ea2aa-bb66-4601-8120-6418a6c3d99b-kube-api-access-jvrbm\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-46h88\" (UID: \"9b8ea2aa-bb66-4601-8120-6418a6c3d99b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:46.983193 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:41:46.983100 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:47.110522 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:47.110492 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88"] Apr 17 21:41:47.113256 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:41:47.113222 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8ea2aa_bb66_4601_8120_6418a6c3d99b.slice/crio-daf7a793af60f6f7acd3ccad75da8d632f313d9b62ce387fd455028b3d83cba3 WatchSource:0}: Error finding container daf7a793af60f6f7acd3ccad75da8d632f313d9b62ce387fd455028b3d83cba3: Status 404 returned error can't find the container with id daf7a793af60f6f7acd3ccad75da8d632f313d9b62ce387fd455028b3d83cba3 Apr 17 21:41:47.115097 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:47.115058 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:41:47.885281 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:47.885241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" event={"ID":"9b8ea2aa-bb66-4601-8120-6418a6c3d99b","Type":"ContainerStarted","Data":"daf7a793af60f6f7acd3ccad75da8d632f313d9b62ce387fd455028b3d83cba3"} Apr 17 21:41:49.893617 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:49.893581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" event={"ID":"9b8ea2aa-bb66-4601-8120-6418a6c3d99b","Type":"ContainerStarted","Data":"8ba5c41a23172a8dc92c1c21a9b83490181c8cb59c1989f3948fa73c30d6678b"} Apr 17 21:41:49.894016 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:49.893712 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:41:49.915723 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:41:49.915656 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" podStartSLOduration=1.538747259 podStartE2EDuration="3.915639852s" podCreationTimestamp="2026-04-17 21:41:46 +0000 UTC" firstStartedPulling="2026-04-17 21:41:47.115276607 +0000 UTC m=+320.949164518" lastFinishedPulling="2026-04-17 21:41:49.492169202 +0000 UTC m=+323.326057111" observedRunningTime="2026-04-17 21:41:49.913359569 +0000 UTC m=+323.747247522" watchObservedRunningTime="2026-04-17 21:41:49.915639852 +0000 UTC m=+323.749527780" Apr 17 21:42:00.899330 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:00.899287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-46h88" Apr 17 21:42:04.558621 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.558581 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq"] Apr 17 21:42:04.562037 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.562016 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.565653 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.565629 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 21:42:04.565653 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.565629 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 21:42:04.565815 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.565634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-clct9\"" Apr 17 21:42:04.571069 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.571049 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq"] Apr 17 21:42:04.712486 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.712447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0ac529e-2275-450c-89eb-219a1b2af810-tls-certs\") pod \"kube-auth-proxy-79c9d94f8f-6r4gq\" (UID: \"c0ac529e-2275-450c-89eb-219a1b2af810\") " pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.712653 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.712568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0ac529e-2275-450c-89eb-219a1b2af810-tmp\") pod \"kube-auth-proxy-79c9d94f8f-6r4gq\" (UID: \"c0ac529e-2275-450c-89eb-219a1b2af810\") " pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.712709 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.712668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltz6z\" (UniqueName: 
\"kubernetes.io/projected/c0ac529e-2275-450c-89eb-219a1b2af810-kube-api-access-ltz6z\") pod \"kube-auth-proxy-79c9d94f8f-6r4gq\" (UID: \"c0ac529e-2275-450c-89eb-219a1b2af810\") " pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.813309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.813210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0ac529e-2275-450c-89eb-219a1b2af810-tmp\") pod \"kube-auth-proxy-79c9d94f8f-6r4gq\" (UID: \"c0ac529e-2275-450c-89eb-219a1b2af810\") " pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.813309 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.813309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltz6z\" (UniqueName: \"kubernetes.io/projected/c0ac529e-2275-450c-89eb-219a1b2af810-kube-api-access-ltz6z\") pod \"kube-auth-proxy-79c9d94f8f-6r4gq\" (UID: \"c0ac529e-2275-450c-89eb-219a1b2af810\") " pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.813515 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.813365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0ac529e-2275-450c-89eb-219a1b2af810-tls-certs\") pod \"kube-auth-proxy-79c9d94f8f-6r4gq\" (UID: \"c0ac529e-2275-450c-89eb-219a1b2af810\") " pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.815774 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.815745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0ac529e-2275-450c-89eb-219a1b2af810-tmp\") pod \"kube-auth-proxy-79c9d94f8f-6r4gq\" (UID: \"c0ac529e-2275-450c-89eb-219a1b2af810\") " pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.816104 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.816065 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0ac529e-2275-450c-89eb-219a1b2af810-tls-certs\") pod \"kube-auth-proxy-79c9d94f8f-6r4gq\" (UID: \"c0ac529e-2275-450c-89eb-219a1b2af810\") " pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.827278 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.827247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltz6z\" (UniqueName: \"kubernetes.io/projected/c0ac529e-2275-450c-89eb-219a1b2af810-kube-api-access-ltz6z\") pod \"kube-auth-proxy-79c9d94f8f-6r4gq\" (UID: \"c0ac529e-2275-450c-89eb-219a1b2af810\") " pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.871912 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.871872 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" Apr 17 21:42:04.997946 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:04.997920 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq"] Apr 17 21:42:05.000598 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:42:05.000568 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ac529e_2275_450c_89eb_219a1b2af810.slice/crio-53c4b90859a90e332f8837d9cd148f51795058887764931ad75c63f68492b1ab WatchSource:0}: Error finding container 53c4b90859a90e332f8837d9cd148f51795058887764931ad75c63f68492b1ab: Status 404 returned error can't find the container with id 53c4b90859a90e332f8837d9cd148f51795058887764931ad75c63f68492b1ab Apr 17 21:42:05.949232 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:05.949156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" 
event={"ID":"c0ac529e-2275-450c-89eb-219a1b2af810","Type":"ContainerStarted","Data":"53c4b90859a90e332f8837d9cd148f51795058887764931ad75c63f68492b1ab"} Apr 17 21:42:08.091376 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.091345 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-4tmcs"] Apr 17 21:42:08.094379 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.094352 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:08.096636 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.096614 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 17 21:42:08.096755 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.096643 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-66d9g\"" Apr 17 21:42:08.102578 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.102531 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-4tmcs"] Apr 17 21:42:08.242337 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.242305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swpwd\" (UniqueName: \"kubernetes.io/projected/4a69880f-4007-485f-88af-595348fe4dab-kube-api-access-swpwd\") pod \"odh-model-controller-858dbf95b8-4tmcs\" (UID: \"4a69880f-4007-485f-88af-595348fe4dab\") " pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:08.242505 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.242375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a69880f-4007-485f-88af-595348fe4dab-cert\") pod \"odh-model-controller-858dbf95b8-4tmcs\" (UID: 
\"4a69880f-4007-485f-88af-595348fe4dab\") " pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:08.343847 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.343438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swpwd\" (UniqueName: \"kubernetes.io/projected/4a69880f-4007-485f-88af-595348fe4dab-kube-api-access-swpwd\") pod \"odh-model-controller-858dbf95b8-4tmcs\" (UID: \"4a69880f-4007-485f-88af-595348fe4dab\") " pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:08.343847 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.343555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a69880f-4007-485f-88af-595348fe4dab-cert\") pod \"odh-model-controller-858dbf95b8-4tmcs\" (UID: \"4a69880f-4007-485f-88af-595348fe4dab\") " pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:08.343847 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:42:08.343699 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 21:42:08.343847 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:42:08.343762 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a69880f-4007-485f-88af-595348fe4dab-cert podName:4a69880f-4007-485f-88af-595348fe4dab nodeName:}" failed. No retries permitted until 2026-04-17 21:42:08.843741541 +0000 UTC m=+342.677629455 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a69880f-4007-485f-88af-595348fe4dab-cert") pod "odh-model-controller-858dbf95b8-4tmcs" (UID: "4a69880f-4007-485f-88af-595348fe4dab") : secret "odh-model-controller-webhook-cert" not found Apr 17 21:42:08.352193 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.352165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swpwd\" (UniqueName: \"kubernetes.io/projected/4a69880f-4007-485f-88af-595348fe4dab-kube-api-access-swpwd\") pod \"odh-model-controller-858dbf95b8-4tmcs\" (UID: \"4a69880f-4007-485f-88af-595348fe4dab\") " pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:08.846778 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.846750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a69880f-4007-485f-88af-595348fe4dab-cert\") pod \"odh-model-controller-858dbf95b8-4tmcs\" (UID: \"4a69880f-4007-485f-88af-595348fe4dab\") " pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:08.849237 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.849207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a69880f-4007-485f-88af-595348fe4dab-cert\") pod \"odh-model-controller-858dbf95b8-4tmcs\" (UID: \"4a69880f-4007-485f-88af-595348fe4dab\") " pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:08.960797 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.960758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" event={"ID":"c0ac529e-2275-450c-89eb-219a1b2af810","Type":"ContainerStarted","Data":"431db3c190c9f2d374789486589e9e49766f1f6e8a2ce0a9182c608cf2f4c78c"} Apr 17 21:42:08.978042 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:08.977986 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress/kube-auth-proxy-79c9d94f8f-6r4gq" podStartSLOduration=1.95995838 podStartE2EDuration="4.977971308s" podCreationTimestamp="2026-04-17 21:42:04 +0000 UTC" firstStartedPulling="2026-04-17 21:42:05.002269034 +0000 UTC m=+338.836156941" lastFinishedPulling="2026-04-17 21:42:08.020281949 +0000 UTC m=+341.854169869" observedRunningTime="2026-04-17 21:42:08.976393321 +0000 UTC m=+342.810281248" watchObservedRunningTime="2026-04-17 21:42:08.977971308 +0000 UTC m=+342.811859237" Apr 17 21:42:09.006122 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:09.006054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:09.132352 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:09.132252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-4tmcs"] Apr 17 21:42:09.135190 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:42:09.135157 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a69880f_4007_485f_88af_595348fe4dab.slice/crio-7256effeabe985d5121c2099d89ce8dbefc0781494b1c25b602f191deda07afd WatchSource:0}: Error finding container 7256effeabe985d5121c2099d89ce8dbefc0781494b1c25b602f191deda07afd: Status 404 returned error can't find the container with id 7256effeabe985d5121c2099d89ce8dbefc0781494b1c25b602f191deda07afd Apr 17 21:42:09.965491 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:09.965448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" event={"ID":"4a69880f-4007-485f-88af-595348fe4dab","Type":"ContainerStarted","Data":"7256effeabe985d5121c2099d89ce8dbefc0781494b1c25b602f191deda07afd"} Apr 17 21:42:11.973397 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:11.973363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" event={"ID":"4a69880f-4007-485f-88af-595348fe4dab","Type":"ContainerStarted","Data":"8760c0e4a2fe6ed6df63de260cb69ea1b7fbcd5260b6758827ba5dede66e4252"} Apr 17 21:42:11.973775 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:11.973439 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:11.988962 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:11.988912 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" podStartSLOduration=1.243290139 podStartE2EDuration="3.988899057s" podCreationTimestamp="2026-04-17 21:42:08 +0000 UTC" firstStartedPulling="2026-04-17 21:42:09.136495391 +0000 UTC m=+342.970383300" lastFinishedPulling="2026-04-17 21:42:11.882104297 +0000 UTC m=+345.715992218" observedRunningTime="2026-04-17 21:42:11.987688295 +0000 UTC m=+345.821576207" watchObservedRunningTime="2026-04-17 21:42:11.988899057 +0000 UTC m=+345.822786984" Apr 17 21:42:12.978506 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:12.978471 2576 generic.go:358] "Generic (PLEG): container finished" podID="4a69880f-4007-485f-88af-595348fe4dab" containerID="8760c0e4a2fe6ed6df63de260cb69ea1b7fbcd5260b6758827ba5dede66e4252" exitCode=1 Apr 17 21:42:12.978951 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:12.978521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" event={"ID":"4a69880f-4007-485f-88af-595348fe4dab","Type":"ContainerDied","Data":"8760c0e4a2fe6ed6df63de260cb69ea1b7fbcd5260b6758827ba5dede66e4252"} Apr 17 21:42:12.978951 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:12.978847 2576 scope.go:117] "RemoveContainer" containerID="8760c0e4a2fe6ed6df63de260cb69ea1b7fbcd5260b6758827ba5dede66e4252" Apr 17 21:42:13.982795 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:13.982757 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="4a69880f-4007-485f-88af-595348fe4dab" containerID="730e9efb506f4e74094a5eb0013178ffc682ce9ceffee9908e76e20a952c6c08" exitCode=1 Apr 17 21:42:13.983264 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:13.982839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" event={"ID":"4a69880f-4007-485f-88af-595348fe4dab","Type":"ContainerDied","Data":"730e9efb506f4e74094a5eb0013178ffc682ce9ceffee9908e76e20a952c6c08"} Apr 17 21:42:13.983264 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:13.982880 2576 scope.go:117] "RemoveContainer" containerID="8760c0e4a2fe6ed6df63de260cb69ea1b7fbcd5260b6758827ba5dede66e4252" Apr 17 21:42:13.983264 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:13.983152 2576 scope.go:117] "RemoveContainer" containerID="730e9efb506f4e74094a5eb0013178ffc682ce9ceffee9908e76e20a952c6c08" Apr 17 21:42:13.983438 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:42:13.983387 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-4tmcs_opendatahub(4a69880f-4007-485f-88af-595348fe4dab)\"" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" podUID="4a69880f-4007-485f-88af-595348fe4dab" Apr 17 21:42:14.106456 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.106422 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-xblx8"] Apr 17 21:42:14.110849 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.110823 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:14.113619 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.113429 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 17 21:42:14.113619 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.113571 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-njjnj\"" Apr 17 21:42:14.119194 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.119155 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-xblx8"] Apr 17 21:42:14.183394 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.183357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85n4f\" (UniqueName: \"kubernetes.io/projected/0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec-kube-api-access-85n4f\") pod \"kserve-controller-manager-856948b99f-xblx8\" (UID: \"0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec\") " pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:14.183394 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.183394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec-cert\") pod \"kserve-controller-manager-856948b99f-xblx8\" (UID: \"0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec\") " pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:14.284447 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.284363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85n4f\" (UniqueName: \"kubernetes.io/projected/0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec-kube-api-access-85n4f\") pod \"kserve-controller-manager-856948b99f-xblx8\" (UID: 
\"0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec\") " pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:14.284447 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.284401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec-cert\") pod \"kserve-controller-manager-856948b99f-xblx8\" (UID: \"0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec\") " pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:14.284622 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:42:14.284504 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 21:42:14.284622 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:42:14.284557 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec-cert podName:0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec nodeName:}" failed. No retries permitted until 2026-04-17 21:42:14.784542355 +0000 UTC m=+348.618430264 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec-cert") pod "kserve-controller-manager-856948b99f-xblx8" (UID: "0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec") : secret "kserve-webhook-server-cert" not found Apr 17 21:42:14.293791 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.293765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85n4f\" (UniqueName: \"kubernetes.io/projected/0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec-kube-api-access-85n4f\") pod \"kserve-controller-manager-856948b99f-xblx8\" (UID: \"0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec\") " pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:14.788647 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.788595 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec-cert\") pod \"kserve-controller-manager-856948b99f-xblx8\" (UID: \"0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec\") " pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:14.790998 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.790977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec-cert\") pod \"kserve-controller-manager-856948b99f-xblx8\" (UID: \"0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec\") " pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:14.987836 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:14.987804 2576 scope.go:117] "RemoveContainer" containerID="730e9efb506f4e74094a5eb0013178ffc682ce9ceffee9908e76e20a952c6c08" Apr 17 21:42:14.988251 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:42:14.987989 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager 
pod=odh-model-controller-858dbf95b8-4tmcs_opendatahub(4a69880f-4007-485f-88af-595348fe4dab)\"" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" podUID="4a69880f-4007-485f-88af-595348fe4dab" Apr 17 21:42:15.023975 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:15.023935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:15.150869 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:15.150846 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-xblx8"] Apr 17 21:42:15.152733 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:42:15.152703 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cfa0f82_7fd4_413b_b0e8_1ad64764a7ec.slice/crio-604ccd453d5d62aa08a2a7d5bf784c8f15992f7675ba86d9b6f377beeba2578c WatchSource:0}: Error finding container 604ccd453d5d62aa08a2a7d5bf784c8f15992f7675ba86d9b6f377beeba2578c: Status 404 returned error can't find the container with id 604ccd453d5d62aa08a2a7d5bf784c8f15992f7675ba86d9b6f377beeba2578c Apr 17 21:42:15.991211 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:15.991128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" event={"ID":"0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec","Type":"ContainerStarted","Data":"604ccd453d5d62aa08a2a7d5bf784c8f15992f7675ba86d9b6f377beeba2578c"} Apr 17 21:42:18.000315 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.000277 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" event={"ID":"0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec","Type":"ContainerStarted","Data":"d8ed72f217cedb4c704c9062f9b62d90aff67d2071b2ff8ec0b8b0e0b78929eb"} Apr 17 21:42:18.000745 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.000396 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:42:18.023945 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.023837 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" podStartSLOduration=1.417601826 podStartE2EDuration="4.023820098s" podCreationTimestamp="2026-04-17 21:42:14 +0000 UTC" firstStartedPulling="2026-04-17 21:42:15.154169767 +0000 UTC m=+348.988057676" lastFinishedPulling="2026-04-17 21:42:17.760388028 +0000 UTC m=+351.594275948" observedRunningTime="2026-04-17 21:42:18.021340669 +0000 UTC m=+351.855228598" watchObservedRunningTime="2026-04-17 21:42:18.023820098 +0000 UTC m=+351.857708027" Apr 17 21:42:18.041040 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.041005 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln"] Apr 17 21:42:18.044235 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.044217 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:18.047426 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.047404 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-gwzr7\"" Apr 17 21:42:18.047558 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.047482 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 17 21:42:18.047618 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.047595 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 17 21:42:18.072792 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.072758 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln"] Apr 17 21:42:18.114791 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.114747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/fad3f3da-91c9-4d73-8835-0bd634d3f4c4-operator-config\") pod \"servicemesh-operator3-55f49c5f94-pp4ln\" (UID: \"fad3f3da-91c9-4d73-8835-0bd634d3f4c4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:18.114791 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.114793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms8f5\" (UniqueName: \"kubernetes.io/projected/fad3f3da-91c9-4d73-8835-0bd634d3f4c4-kube-api-access-ms8f5\") pod \"servicemesh-operator3-55f49c5f94-pp4ln\" (UID: \"fad3f3da-91c9-4d73-8835-0bd634d3f4c4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:18.215385 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.215350 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ms8f5\" (UniqueName: \"kubernetes.io/projected/fad3f3da-91c9-4d73-8835-0bd634d3f4c4-kube-api-access-ms8f5\") pod \"servicemesh-operator3-55f49c5f94-pp4ln\" (UID: \"fad3f3da-91c9-4d73-8835-0bd634d3f4c4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:18.215568 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.215445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/fad3f3da-91c9-4d73-8835-0bd634d3f4c4-operator-config\") pod \"servicemesh-operator3-55f49c5f94-pp4ln\" (UID: \"fad3f3da-91c9-4d73-8835-0bd634d3f4c4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:18.217949 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.217917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/fad3f3da-91c9-4d73-8835-0bd634d3f4c4-operator-config\") pod \"servicemesh-operator3-55f49c5f94-pp4ln\" (UID: \"fad3f3da-91c9-4d73-8835-0bd634d3f4c4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:18.224927 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.224901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms8f5\" (UniqueName: \"kubernetes.io/projected/fad3f3da-91c9-4d73-8835-0bd634d3f4c4-kube-api-access-ms8f5\") pod \"servicemesh-operator3-55f49c5f94-pp4ln\" (UID: \"fad3f3da-91c9-4d73-8835-0bd634d3f4c4\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:18.353798 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.353760 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:18.485169 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:18.485087 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln"] Apr 17 21:42:18.488104 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:42:18.488062 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfad3f3da_91c9_4d73_8835_0bd634d3f4c4.slice/crio-c3b97763f94c7660cbc6a4eca713522f801f8ff1e7e721fbfe07fa3717c1cbef WatchSource:0}: Error finding container c3b97763f94c7660cbc6a4eca713522f801f8ff1e7e721fbfe07fa3717c1cbef: Status 404 returned error can't find the container with id c3b97763f94c7660cbc6a4eca713522f801f8ff1e7e721fbfe07fa3717c1cbef Apr 17 21:42:19.004619 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:19.004583 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" event={"ID":"fad3f3da-91c9-4d73-8835-0bd634d3f4c4","Type":"ContainerStarted","Data":"c3b97763f94c7660cbc6a4eca713522f801f8ff1e7e721fbfe07fa3717c1cbef"} Apr 17 21:42:21.973787 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:21.973755 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:21.974219 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:21.974198 2576 scope.go:117] "RemoveContainer" containerID="730e9efb506f4e74094a5eb0013178ffc682ce9ceffee9908e76e20a952c6c08" Apr 17 21:42:21.974415 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:42:21.974392 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-4tmcs_opendatahub(4a69880f-4007-485f-88af-595348fe4dab)\"" 
pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" podUID="4a69880f-4007-485f-88af-595348fe4dab" Apr 17 21:42:23.021729 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:23.021639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" event={"ID":"fad3f3da-91c9-4d73-8835-0bd634d3f4c4","Type":"ContainerStarted","Data":"79f1d0f91aa066c24c929bbb92b01d0450134a7fbf99a4b7fb7ab91a900a9f35"} Apr 17 21:42:23.022227 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:23.021780 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:23.044922 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:23.044860 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" podStartSLOduration=0.785165009 podStartE2EDuration="5.044840816s" podCreationTimestamp="2026-04-17 21:42:18 +0000 UTC" firstStartedPulling="2026-04-17 21:42:18.491760375 +0000 UTC m=+352.325648284" lastFinishedPulling="2026-04-17 21:42:22.751436182 +0000 UTC m=+356.585324091" observedRunningTime="2026-04-17 21:42:23.041879607 +0000 UTC m=+356.875767564" watchObservedRunningTime="2026-04-17 21:42:23.044840816 +0000 UTC m=+356.878728745" Apr 17 21:42:28.173135 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.173096 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l"] Apr 17 21:42:28.182799 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.182774 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.185410 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.185384 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 21:42:28.185410 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.185396 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 17 21:42:28.185712 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.185694 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 17 21:42:28.185900 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.185815 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-6xwkw\"" Apr 17 21:42:28.186046 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.185749 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 17 21:42:28.188940 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.188919 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l"] Apr 17 21:42:28.306960 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.306920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/44fed5ef-59c3-44b3-81d4-e05bb6339307-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.306960 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.306966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: 
\"kubernetes.io/secret/44fed5ef-59c3-44b3-81d4-e05bb6339307-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.307198 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.306990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shx5w\" (UniqueName: \"kubernetes.io/projected/44fed5ef-59c3-44b3-81d4-e05bb6339307-kube-api-access-shx5w\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.307198 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.307021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.307198 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.307053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.307198 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.307161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" 
(UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.307198 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.307191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.408260 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.408221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/44fed5ef-59c3-44b3-81d4-e05bb6339307-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.408260 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.408258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/44fed5ef-59c3-44b3-81d4-e05bb6339307-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.408515 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.408279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shx5w\" (UniqueName: \"kubernetes.io/projected/44fed5ef-59c3-44b3-81d4-e05bb6339307-kube-api-access-shx5w\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.408515 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.408298 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.408515 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.408326 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.408764 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.408738 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.408935 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.408904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.409055 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.409014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.410789 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.410761 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/44fed5ef-59c3-44b3-81d4-e05bb6339307-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.410921 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.410901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/44fed5ef-59c3-44b3-81d4-e05bb6339307-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.410977 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.410928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.411016 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.410988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.416025 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.415988 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-shx5w\" (UniqueName: \"kubernetes.io/projected/44fed5ef-59c3-44b3-81d4-e05bb6339307-kube-api-access-shx5w\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.416588 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.416571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/44fed5ef-59c3-44b3-81d4-e05bb6339307-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-4z57l\" (UID: \"44fed5ef-59c3-44b3-81d4-e05bb6339307\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.494652 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.494550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:28.639262 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:28.639234 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l"] Apr 17 21:42:28.641137 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:42:28.641103 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44fed5ef_59c3_44b3_81d4_e05bb6339307.slice/crio-7e3f727b30dfb661f9d541d6d9689dad5108815905d06d09f3e7e8b626f713b3 WatchSource:0}: Error finding container 7e3f727b30dfb661f9d541d6d9689dad5108815905d06d09f3e7e8b626f713b3: Status 404 returned error can't find the container with id 7e3f727b30dfb661f9d541d6d9689dad5108815905d06d09f3e7e8b626f713b3 Apr 17 21:42:29.006700 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:29.006644 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:29.007119 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:42:29.007104 2576 scope.go:117] "RemoveContainer" containerID="730e9efb506f4e74094a5eb0013178ffc682ce9ceffee9908e76e20a952c6c08" Apr 17 21:42:29.043186 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:29.043147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" event={"ID":"44fed5ef-59c3-44b3-81d4-e05bb6339307","Type":"ContainerStarted","Data":"7e3f727b30dfb661f9d541d6d9689dad5108815905d06d09f3e7e8b626f713b3"} Apr 17 21:42:30.049043 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:30.049010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" event={"ID":"4a69880f-4007-485f-88af-595348fe4dab","Type":"ContainerStarted","Data":"2c1ca6b893243e12ee3884cd100bce20fcedeb8f04fe86b44fcfc8c6d7af1357"} Apr 17 21:42:30.049503 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:30.049259 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:31.323411 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:31.323374 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 21:42:31.323672 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:31.323447 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 17 21:42:32.058163 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:32.058123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" event={"ID":"44fed5ef-59c3-44b3-81d4-e05bb6339307","Type":"ContainerStarted","Data":"ea8897d38212c8ec9db0dcd59ba4144490ec02b209272fecbb8eea0624c166e4"} Apr 17 21:42:32.058380 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:42:32.058346 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:32.059935 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:32.059907 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-4z57l container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 17 21:42:32.060054 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:32.059962 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" podUID="44fed5ef-59c3-44b3-81d4-e05bb6339307" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 21:42:32.077999 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:32.077939 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" podStartSLOduration=1.398004089 podStartE2EDuration="4.077921827s" podCreationTimestamp="2026-04-17 21:42:28 +0000 UTC" firstStartedPulling="2026-04-17 21:42:28.643212354 +0000 UTC m=+362.477100260" lastFinishedPulling="2026-04-17 21:42:31.323130092 +0000 UTC m=+365.157017998" observedRunningTime="2026-04-17 21:42:32.075929345 +0000 UTC m=+365.909817274" watchObservedRunningTime="2026-04-17 21:42:32.077921827 +0000 UTC m=+365.911809755" Apr 17 21:42:33.064029 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:33.063997 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-4z57l" Apr 17 21:42:34.028888 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:34.028852 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-pp4ln" Apr 17 21:42:41.056336 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:42:41.056297 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-4tmcs" Apr 17 21:42:49.009579 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:42:49.009552 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-xblx8" Apr 17 21:43:44.969053 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:44.969017 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz"] Apr 17 21:43:44.972451 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:44.972431 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" Apr 17 21:43:44.975001 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:44.974970 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 21:43:44.975130 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:44.974972 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 21:43:44.975966 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:44.975943 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-wzn5f\"" Apr 17 21:43:44.979612 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:44.979317 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz"] Apr 17 21:43:45.057104 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:45.057037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkxb8\" (UniqueName: 
\"kubernetes.io/projected/2d83a924-82e9-494a-a948-95f6e4716ec7-kube-api-access-fkxb8\") pod \"limitador-operator-controller-manager-85c4996f8c-vm6xz\" (UID: \"2d83a924-82e9-494a-a948-95f6e4716ec7\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" Apr 17 21:43:45.158478 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:45.158437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkxb8\" (UniqueName: \"kubernetes.io/projected/2d83a924-82e9-494a-a948-95f6e4716ec7-kube-api-access-fkxb8\") pod \"limitador-operator-controller-manager-85c4996f8c-vm6xz\" (UID: \"2d83a924-82e9-494a-a948-95f6e4716ec7\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" Apr 17 21:43:45.170147 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:45.170120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkxb8\" (UniqueName: \"kubernetes.io/projected/2d83a924-82e9-494a-a948-95f6e4716ec7-kube-api-access-fkxb8\") pod \"limitador-operator-controller-manager-85c4996f8c-vm6xz\" (UID: \"2d83a924-82e9-494a-a948-95f6e4716ec7\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" Apr 17 21:43:45.283283 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:45.283186 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" Apr 17 21:43:45.412487 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:45.412459 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz"] Apr 17 21:43:45.415154 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:43:45.415127 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d83a924_82e9_494a_a948_95f6e4716ec7.slice/crio-5c6cb689894dcdbd8bc143097be4bbcc60ff811000f8b5842656a299bed7f4df WatchSource:0}: Error finding container 5c6cb689894dcdbd8bc143097be4bbcc60ff811000f8b5842656a299bed7f4df: Status 404 returned error can't find the container with id 5c6cb689894dcdbd8bc143097be4bbcc60ff811000f8b5842656a299bed7f4df Apr 17 21:43:46.317320 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:46.317241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" event={"ID":"2d83a924-82e9-494a-a948-95f6e4716ec7","Type":"ContainerStarted","Data":"5c6cb689894dcdbd8bc143097be4bbcc60ff811000f8b5842656a299bed7f4df"} Apr 17 21:43:48.325875 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:48.325840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" event={"ID":"2d83a924-82e9-494a-a948-95f6e4716ec7","Type":"ContainerStarted","Data":"bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900"} Apr 17 21:43:48.326272 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:48.325974 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" Apr 17 21:43:48.343135 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:48.343059 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" podStartSLOduration=2.346353433 podStartE2EDuration="4.343044485s" podCreationTimestamp="2026-04-17 21:43:44 +0000 UTC" firstStartedPulling="2026-04-17 21:43:45.417407684 +0000 UTC m=+439.251295593" lastFinishedPulling="2026-04-17 21:43:47.414098726 +0000 UTC m=+441.247986645" observedRunningTime="2026-04-17 21:43:48.341623303 +0000 UTC m=+442.175511231" watchObservedRunningTime="2026-04-17 21:43:48.343044485 +0000 UTC m=+442.176932412" Apr 17 21:43:59.331584 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:43:59.331551 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" Apr 17 21:44:00.879583 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.879552 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz"] Apr 17 21:44:00.879994 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.879766 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" containerName="manager" containerID="cri-o://bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900" gracePeriod=2 Apr 17 21:44:00.892658 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.892626 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz"] Apr 17 21:44:00.900358 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.900330 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n"] Apr 17 21:44:00.900664 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.900652 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" 
containerName="manager" Apr 17 21:44:00.900705 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.900666 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" containerName="manager" Apr 17 21:44:00.900742 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.900728 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" containerName="manager" Apr 17 21:44:00.904496 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.904471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n" Apr 17 21:44:00.906724 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.906696 2576 status_manager.go:895] "Failed to get status for pod" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" err="pods \"limitador-operator-controller-manager-85c4996f8c-vm6xz\" is forbidden: User \"system:node:ip-10-0-141-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-47.ec2.internal' and this object" Apr 17 21:44:00.921094 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.916958 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n"] Apr 17 21:44:00.992655 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:00.992622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl8tw\" (UniqueName: \"kubernetes.io/projected/edc3476e-a908-46cd-b52f-a4e3dea16582-kube-api-access-fl8tw\") pod \"limitador-operator-controller-manager-85c4996f8c-7lc7n\" (UID: \"edc3476e-a908-46cd-b52f-a4e3dea16582\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n" Apr 17 21:44:01.093287 ip-10-0-141-47 
kubenswrapper[2576]: I0417 21:44:01.093251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl8tw\" (UniqueName: \"kubernetes.io/projected/edc3476e-a908-46cd-b52f-a4e3dea16582-kube-api-access-fl8tw\") pod \"limitador-operator-controller-manager-85c4996f8c-7lc7n\" (UID: \"edc3476e-a908-46cd-b52f-a4e3dea16582\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n" Apr 17 21:44:01.108209 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.108178 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl8tw\" (UniqueName: \"kubernetes.io/projected/edc3476e-a908-46cd-b52f-a4e3dea16582-kube-api-access-fl8tw\") pod \"limitador-operator-controller-manager-85c4996f8c-7lc7n\" (UID: \"edc3476e-a908-46cd-b52f-a4e3dea16582\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n" Apr 17 21:44:01.117225 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.117201 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" Apr 17 21:44:01.119621 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.119591 2576 status_manager.go:895] "Failed to get status for pod" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" err="pods \"limitador-operator-controller-manager-85c4996f8c-vm6xz\" is forbidden: User \"system:node:ip-10-0-141-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-47.ec2.internal' and this object" Apr 17 21:44:01.193737 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.193645 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkxb8\" (UniqueName: \"kubernetes.io/projected/2d83a924-82e9-494a-a948-95f6e4716ec7-kube-api-access-fkxb8\") pod \"2d83a924-82e9-494a-a948-95f6e4716ec7\" (UID: \"2d83a924-82e9-494a-a948-95f6e4716ec7\") " Apr 17 21:44:01.195792 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.195763 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d83a924-82e9-494a-a948-95f6e4716ec7-kube-api-access-fkxb8" (OuterVolumeSpecName: "kube-api-access-fkxb8") pod "2d83a924-82e9-494a-a948-95f6e4716ec7" (UID: "2d83a924-82e9-494a-a948-95f6e4716ec7"). InnerVolumeSpecName "kube-api-access-fkxb8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:44:01.253816 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.253776 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n" Apr 17 21:44:01.294674 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.294637 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fkxb8\" (UniqueName: \"kubernetes.io/projected/2d83a924-82e9-494a-a948-95f6e4716ec7-kube-api-access-fkxb8\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\"" Apr 17 21:44:01.371559 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.371520 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d83a924-82e9-494a-a948-95f6e4716ec7" containerID="bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900" exitCode=0 Apr 17 21:44:01.371748 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.371579 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" Apr 17 21:44:01.371748 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.371619 2576 scope.go:117] "RemoveContainer" containerID="bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900" Apr 17 21:44:01.373932 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.373903 2576 status_manager.go:895] "Failed to get status for pod" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" err="pods \"limitador-operator-controller-manager-85c4996f8c-vm6xz\" is forbidden: User \"system:node:ip-10-0-141-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-47.ec2.internal' and this object" Apr 17 21:44:01.380393 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.380345 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n"] Apr 17 21:44:01.381297 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.381278 2576 
scope.go:117] "RemoveContainer" containerID="bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900" Apr 17 21:44:01.381599 ip-10-0-141-47 kubenswrapper[2576]: E0417 21:44:01.381579 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900\": container with ID starting with bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900 not found: ID does not exist" containerID="bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900" Apr 17 21:44:01.381670 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.381611 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900"} err="failed to get container status \"bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900\": rpc error: code = NotFound desc = could not find container \"bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900\": container with ID starting with bebbf4aa42f6acfd699faa68238c5bb938df4f141b01dd14fb176f7f53981900 not found: ID does not exist" Apr 17 21:44:01.382800 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:01.382776 2576 status_manager.go:895] "Failed to get status for pod" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" err="pods \"limitador-operator-controller-manager-85c4996f8c-vm6xz\" is forbidden: User \"system:node:ip-10-0-141-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-47.ec2.internal' and this object" Apr 17 21:44:01.383044 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:44:01.383024 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedc3476e_a908_46cd_b52f_a4e3dea16582.slice/crio-ed8935439f81c21076cb6b6ef539ed1d526ce27655e2b42bc7b73718ecb50e5d WatchSource:0}: Error finding container ed8935439f81c21076cb6b6ef539ed1d526ce27655e2b42bc7b73718ecb50e5d: Status 404 returned error can't find the container with id ed8935439f81c21076cb6b6ef539ed1d526ce27655e2b42bc7b73718ecb50e5d Apr 17 21:44:02.377635 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:02.377593 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n" event={"ID":"edc3476e-a908-46cd-b52f-a4e3dea16582","Type":"ContainerStarted","Data":"77bb6ba1da0303d49a1017e40737ddfb3774cfb22fc7278d2deabad2bc423fcd"} Apr 17 21:44:02.378128 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:02.377640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n" event={"ID":"edc3476e-a908-46cd-b52f-a4e3dea16582","Type":"ContainerStarted","Data":"ed8935439f81c21076cb6b6ef539ed1d526ce27655e2b42bc7b73718ecb50e5d"} Apr 17 21:44:02.378128 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:02.377672 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n" Apr 17 21:44:02.379916 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:02.379888 2576 status_manager.go:895] "Failed to get status for pod" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vm6xz" err="pods \"limitador-operator-controller-manager-85c4996f8c-vm6xz\" is forbidden: User \"system:node:ip-10-0-141-47.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-47.ec2.internal' and this object" Apr 17 21:44:02.400152 ip-10-0-141-47 kubenswrapper[2576]: 
I0417 21:44:02.400070 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n" podStartSLOduration=2.400053353 podStartE2EDuration="2.400053353s" podCreationTimestamp="2026-04-17 21:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:44:02.399393875 +0000 UTC m=+456.233281828" watchObservedRunningTime="2026-04-17 21:44:02.400053353 +0000 UTC m=+456.233941281"
Apr 17 21:44:02.745371 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:02.745293 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d83a924-82e9-494a-a948-95f6e4716ec7" path="/var/lib/kubelet/pods/2d83a924-82e9-494a-a948-95f6e4716ec7/volumes"
Apr 17 21:44:13.384278 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:13.384247 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7lc7n"
Apr 17 21:44:52.634912 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.634873 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:44:52.638476 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.638454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:44:52.640834 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.640812 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 21:44:52.640962 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.640812 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qh6pg\""
Apr 17 21:44:52.644905 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.644879 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:44:52.667767 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.667736 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:44:52.750092 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.750053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7622d\" (UniqueName: \"kubernetes.io/projected/cb73c7a3-25d3-4e4f-b519-fa6cba657b7a-kube-api-access-7622d\") pod \"limitador-limitador-78c99df468-rd6lc\" (UID: \"cb73c7a3-25d3-4e4f-b519-fa6cba657b7a\") " pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:44:52.750276 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.750120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cb73c7a3-25d3-4e4f-b519-fa6cba657b7a-config-file\") pod \"limitador-limitador-78c99df468-rd6lc\" (UID: \"cb73c7a3-25d3-4e4f-b519-fa6cba657b7a\") " pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:44:52.850689 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.850656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7622d\" (UniqueName: \"kubernetes.io/projected/cb73c7a3-25d3-4e4f-b519-fa6cba657b7a-kube-api-access-7622d\") pod \"limitador-limitador-78c99df468-rd6lc\" (UID: \"cb73c7a3-25d3-4e4f-b519-fa6cba657b7a\") " pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:44:52.850881 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.850698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cb73c7a3-25d3-4e4f-b519-fa6cba657b7a-config-file\") pod \"limitador-limitador-78c99df468-rd6lc\" (UID: \"cb73c7a3-25d3-4e4f-b519-fa6cba657b7a\") " pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:44:52.851346 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.851326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cb73c7a3-25d3-4e4f-b519-fa6cba657b7a-config-file\") pod \"limitador-limitador-78c99df468-rd6lc\" (UID: \"cb73c7a3-25d3-4e4f-b519-fa6cba657b7a\") " pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:44:52.858969 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.858943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7622d\" (UniqueName: \"kubernetes.io/projected/cb73c7a3-25d3-4e4f-b519-fa6cba657b7a-kube-api-access-7622d\") pod \"limitador-limitador-78c99df468-rd6lc\" (UID: \"cb73c7a3-25d3-4e4f-b519-fa6cba657b7a\") " pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:44:52.950346 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:52.950252 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:44:53.082337 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:53.082297 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:44:53.085707 ip-10-0-141-47 kubenswrapper[2576]: W0417 21:44:53.085673 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb73c7a3_25d3_4e4f_b519_fa6cba657b7a.slice/crio-2e0767634f86ff56f76f7674ebee1fb2ac3e4febedba7984d4bd8fee6b3003f8 WatchSource:0}: Error finding container 2e0767634f86ff56f76f7674ebee1fb2ac3e4febedba7984d4bd8fee6b3003f8: Status 404 returned error can't find the container with id 2e0767634f86ff56f76f7674ebee1fb2ac3e4febedba7984d4bd8fee6b3003f8
Apr 17 21:44:53.562167 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:53.562117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc" event={"ID":"cb73c7a3-25d3-4e4f-b519-fa6cba657b7a","Type":"ContainerStarted","Data":"2e0767634f86ff56f76f7674ebee1fb2ac3e4febedba7984d4bd8fee6b3003f8"}
Apr 17 21:44:56.574839 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:56.574796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc" event={"ID":"cb73c7a3-25d3-4e4f-b519-fa6cba657b7a","Type":"ContainerStarted","Data":"844ac233c2bae564b55f5ba29892495c8305fff8cd3562bd5820cd6329931a45"}
Apr 17 21:44:56.575270 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:56.574918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:44:56.591089 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:44:56.591026 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc" podStartSLOduration=1.987635995 podStartE2EDuration="4.591010484s" podCreationTimestamp="2026-04-17 21:44:52 +0000 UTC" firstStartedPulling="2026-04-17 21:44:53.088047811 +0000 UTC m=+506.921935717" lastFinishedPulling="2026-04-17 21:44:55.691422298 +0000 UTC m=+509.525310206" observedRunningTime="2026-04-17 21:44:56.589339558 +0000 UTC m=+510.423227476" watchObservedRunningTime="2026-04-17 21:44:56.591010484 +0000 UTC m=+510.424898450"
Apr 17 21:45:07.579626 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:45:07.579596 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-rd6lc"
Apr 17 21:45:51.768706 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:45:51.768664 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:46:04.065684 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:46:04.065593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:46:26.713485 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:46:26.713453 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 21:46:26.714014 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:46:26.713660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 21:46:26.842226 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:46:26.842186 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:46:39.050886 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:46:39.050851 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:46:50.551352 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:46:50.551311 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:47:12.746368 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:47:12.746328 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:47:19.342831 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:47:19.342790 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:47:58.842313 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:47:58.842269 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:48:03.441298 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:48:03.441261 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:48:09.538031 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:48:09.537996 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:48:20.838774 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:48:20.838691 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:48:29.842141 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:48:29.842103 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:48:39.041678 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:48:39.041638 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:48:48.442418 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:48:48.442376 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:48:58.853357 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:48:58.853320 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:50:01.044835 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:50:01.044801 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:50:16.039734 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:50:16.039695 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:50:54.042543 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:50:54.042504 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:51:10.537762 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:51:10.537725 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:51:24.947133 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:51:24.947034 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:51:26.736008 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:51:26.735982 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 21:51:26.738189 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:51:26.738164 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 21:51:41.944610 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:51:41.944573 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:52:37.739028 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:52:37.738991 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:52:46.041930 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:52:46.041839 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:53:03.241487 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:53:03.241444 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:53:12.043944 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:53:12.043899 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:53:28.550526 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:53:28.550484 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:53:36.745989 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:53:36.745956 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:54:09.436983 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:54:09.436947 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:54:18.341323 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:54:18.341238 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:54:26.638858 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:54:26.638814 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:54:35.036480 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:54:35.036444 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:54:43.232993 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:54:43.232949 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:55:00.338794 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:55:00.338757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:55:13.346557 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:55:13.346517 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:55:59.443752 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:55:59.443712 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:56:26.760886 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:56:26.760855 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 21:56:26.764155 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:56:26.764133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 21:57:40.242472 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:57:40.242434 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:57:48.937152 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:57:48.937111 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:57:57.242630 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:57:57.242593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:58:06.146248 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:58:06.146207 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:58:14.439301 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:58:14.439263 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 21:58:23.443367 ip-10-0-141-47 kubenswrapper[2576]: I0417 21:58:23.443327 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:00:00.144746 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.144710 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607720-8gdwb"]
Apr 17 22:00:00.147964 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.147943 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb"
Apr 17 22:00:00.150596 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.150565 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-lpm9t\""
Apr 17 22:00:00.160781 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.160752 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607720-8gdwb"]
Apr 17 22:00:00.195519 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.195482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jndqn\" (UniqueName: \"kubernetes.io/projected/f4d54a82-932a-44dd-be22-ac067be9f8d8-kube-api-access-jndqn\") pod \"maas-api-key-cleanup-29607720-8gdwb\" (UID: \"f4d54a82-932a-44dd-be22-ac067be9f8d8\") " pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb"
Apr 17 22:00:00.296863 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.296828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jndqn\" (UniqueName: \"kubernetes.io/projected/f4d54a82-932a-44dd-be22-ac067be9f8d8-kube-api-access-jndqn\") pod \"maas-api-key-cleanup-29607720-8gdwb\" (UID: \"f4d54a82-932a-44dd-be22-ac067be9f8d8\") " pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb"
Apr 17 22:00:00.307756 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.307730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jndqn\" (UniqueName: \"kubernetes.io/projected/f4d54a82-932a-44dd-be22-ac067be9f8d8-kube-api-access-jndqn\") pod \"maas-api-key-cleanup-29607720-8gdwb\" (UID: \"f4d54a82-932a-44dd-be22-ac067be9f8d8\") " pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb"
Apr 17 22:00:00.458153 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.458037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb"
Apr 17 22:00:00.579544 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.579510 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607720-8gdwb"]
Apr 17 22:00:00.582302 ip-10-0-141-47 kubenswrapper[2576]: W0417 22:00:00.582273 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4d54a82_932a_44dd_be22_ac067be9f8d8.slice/crio-5d530a630c0fc5911c18b6ac6eb44e4c4aea9853020035405297fd69338cc52d WatchSource:0}: Error finding container 5d530a630c0fc5911c18b6ac6eb44e4c4aea9853020035405297fd69338cc52d: Status 404 returned error can't find the container with id 5d530a630c0fc5911c18b6ac6eb44e4c4aea9853020035405297fd69338cc52d
Apr 17 22:00:00.584576 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.584558 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 22:00:00.665495 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:00.665460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" event={"ID":"f4d54a82-932a-44dd-be22-ac067be9f8d8","Type":"ContainerStarted","Data":"5d530a630c0fc5911c18b6ac6eb44e4c4aea9853020035405297fd69338cc52d"}
Apr 17 22:00:03.680899 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:03.680864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" event={"ID":"f4d54a82-932a-44dd-be22-ac067be9f8d8","Type":"ContainerStarted","Data":"3f9480b6edb659aa0a8562587651be2a8e510dbeef77777c04751d99ec048481"}
Apr 17 22:00:03.695734 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:03.695674 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" podStartSLOduration=1.396294838 podStartE2EDuration="3.695657075s" podCreationTimestamp="2026-04-17 22:00:00 +0000 UTC" firstStartedPulling="2026-04-17 22:00:00.584686457 +0000 UTC m=+1414.418574363" lastFinishedPulling="2026-04-17 22:00:02.884048685 +0000 UTC m=+1416.717936600" observedRunningTime="2026-04-17 22:00:03.694174693 +0000 UTC m=+1417.528062631" watchObservedRunningTime="2026-04-17 22:00:03.695657075 +0000 UTC m=+1417.529545002"
Apr 17 22:00:03.744386 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:03.741304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:00:13.637052 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:13.637015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:00:23.751708 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:23.751672 2576 generic.go:358] "Generic (PLEG): container finished" podID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerID="3f9480b6edb659aa0a8562587651be2a8e510dbeef77777c04751d99ec048481" exitCode=6
Apr 17 22:00:23.752138 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:23.751748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" event={"ID":"f4d54a82-932a-44dd-be22-ac067be9f8d8","Type":"ContainerDied","Data":"3f9480b6edb659aa0a8562587651be2a8e510dbeef77777c04751d99ec048481"}
Apr 17 22:00:23.752138 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:23.752099 2576 scope.go:117] "RemoveContainer" containerID="3f9480b6edb659aa0a8562587651be2a8e510dbeef77777c04751d99ec048481"
Apr 17 22:00:24.756319 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:24.756274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" event={"ID":"f4d54a82-932a-44dd-be22-ac067be9f8d8","Type":"ContainerStarted","Data":"11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2"}
Apr 17 22:00:44.829116 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:44.829013 2576 generic.go:358] "Generic (PLEG): container finished" podID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerID="11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2" exitCode=6
Apr 17 22:00:44.829683 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:44.829104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" event={"ID":"f4d54a82-932a-44dd-be22-ac067be9f8d8","Type":"ContainerDied","Data":"11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2"}
Apr 17 22:00:44.829683 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:44.829156 2576 scope.go:117] "RemoveContainer" containerID="3f9480b6edb659aa0a8562587651be2a8e510dbeef77777c04751d99ec048481"
Apr 17 22:00:44.829683 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:44.829511 2576 scope.go:117] "RemoveContainer" containerID="11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2"
Apr 17 22:00:44.829851 ip-10-0-141-47 kubenswrapper[2576]: E0417 22:00:44.829768 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607720-8gdwb_opendatahub(f4d54a82-932a-44dd-be22-ac067be9f8d8)\"" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8"
Apr 17 22:00:59.741436 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:00:59.741407 2576 scope.go:117] "RemoveContainer" containerID="11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2"
Apr 17 22:01:00.010090 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:00.010045 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607720-8gdwb"]
Apr 17 22:01:00.883753 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:00.883715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" event={"ID":"f4d54a82-932a-44dd-be22-ac067be9f8d8","Type":"ContainerStarted","Data":"01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97"}
Apr 17 22:01:00.884197 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:00.883793 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" containerID="cri-o://01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97" gracePeriod=30
Apr 17 22:01:20.624588 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.624562 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb"
Apr 17 22:01:20.680794 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.680710 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jndqn\" (UniqueName: \"kubernetes.io/projected/f4d54a82-932a-44dd-be22-ac067be9f8d8-kube-api-access-jndqn\") pod \"f4d54a82-932a-44dd-be22-ac067be9f8d8\" (UID: \"f4d54a82-932a-44dd-be22-ac067be9f8d8\") "
Apr 17 22:01:20.683018 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.682984 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d54a82-932a-44dd-be22-ac067be9f8d8-kube-api-access-jndqn" (OuterVolumeSpecName: "kube-api-access-jndqn") pod "f4d54a82-932a-44dd-be22-ac067be9f8d8" (UID: "f4d54a82-932a-44dd-be22-ac067be9f8d8"). InnerVolumeSpecName "kube-api-access-jndqn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 22:01:20.781490 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.781455 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jndqn\" (UniqueName: \"kubernetes.io/projected/f4d54a82-932a-44dd-be22-ac067be9f8d8-kube-api-access-jndqn\") on node \"ip-10-0-141-47.ec2.internal\" DevicePath \"\""
Apr 17 22:01:20.950834 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.950750 2576 generic.go:358] "Generic (PLEG): container finished" podID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerID="01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97" exitCode=6
Apr 17 22:01:20.950985 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.950833 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb"
Apr 17 22:01:20.950985 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.950833 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" event={"ID":"f4d54a82-932a-44dd-be22-ac067be9f8d8","Type":"ContainerDied","Data":"01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97"}
Apr 17 22:01:20.950985 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.950943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607720-8gdwb" event={"ID":"f4d54a82-932a-44dd-be22-ac067be9f8d8","Type":"ContainerDied","Data":"5d530a630c0fc5911c18b6ac6eb44e4c4aea9853020035405297fd69338cc52d"}
Apr 17 22:01:20.950985 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.950963 2576 scope.go:117] "RemoveContainer" containerID="01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97"
Apr 17 22:01:20.959400 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.959372 2576 scope.go:117] "RemoveContainer" containerID="11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2"
Apr 17 22:01:20.966851 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.966800 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607720-8gdwb"]
Apr 17 22:01:20.968376 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.968358 2576 scope.go:117] "RemoveContainer" containerID="01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97"
Apr 17 22:01:20.968656 ip-10-0-141-47 kubenswrapper[2576]: E0417 22:01:20.968634 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97\": container with ID starting with 01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97 not found: ID does not exist" containerID="01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97"
Apr 17 22:01:20.968709 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.968670 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97"} err="failed to get container status \"01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97\": rpc error: code = NotFound desc = could not find container \"01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97\": container with ID starting with 01090eebe5a16b7eff3c204968663d2ce8fdeb6da4d9ceb8b0630497798a2c97 not found: ID does not exist"
Apr 17 22:01:20.968709 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.968697 2576 scope.go:117] "RemoveContainer" containerID="11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2"
Apr 17 22:01:20.968939 ip-10-0-141-47 kubenswrapper[2576]: E0417 22:01:20.968922 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2\": container with ID starting with 11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2 not found: ID does not exist" containerID="11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2"
Apr 17 22:01:20.968991 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.968946 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2"} err="failed to get container status \"11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2\": rpc error: code = NotFound desc = could not find container \"11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2\": container with ID starting with 11714ba60ca20a69b36d5f5f9f9a50421ea0154edadca0bf0cfbd2e84318bdb2 not found: ID does not exist"
Apr 17 22:01:20.969037 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:20.968999 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607720-8gdwb"]
Apr 17 22:01:22.745465 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:22.745430 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" path="/var/lib/kubelet/pods/f4d54a82-932a-44dd-be22-ac067be9f8d8/volumes"
Apr 17 22:01:26.784557 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:26.784532 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 22:01:26.788423 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:26.788397 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 22:01:52.539955 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:01:52.539916 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:02:02.340169 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:02:02.340131 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:02:10.338578 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:02:10.338543 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:02:19.244036 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:02:19.243995 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:02:27.840272 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:02:27.840231 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:02:36.545453 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:02:36.545412 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:02:45.138201 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:02:45.138166 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:02:54.341767 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:02:54.341722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:03:01.738348 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:03:01.738315 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:05:17.880661 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:05:17.880617 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:05:25.678775 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:05:25.678741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:05:51.073404 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:05:51.073367 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:05:57.675817 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:05:57.675775 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:06:07.076068 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:07.076034 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:06:17.777419 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:17.777324 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-rd6lc"]
Apr 17 22:06:26.808879 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:26.808845 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 22:06:26.816124 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:26.816094 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log"
Apr 17 22:06:28.211448 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:28.211412 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-xblx8_0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec/manager/0.log"
Apr 17 22:06:28.549514 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:28.549435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-4tmcs_4a69880f-4007-485f-88af-595348fe4dab/manager/2.log"
Apr 17 22:06:28.662824 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:28.662793 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bfddf7b9f-46h88_9b8ea2aa-bb66-4601-8120-6418a6c3d99b/manager/0.log"
Apr 17 22:06:30.888874 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:30.888847 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-rd6lc_cb73c7a3-25d3-4e4f-b519-fa6cba657b7a/limitador/0.log"
Apr 17 22:06:30.996986 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:30.996956 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-7lc7n_edc3476e-a908-46cd-b52f-a4e3dea16582/manager/0.log"
Apr 17 22:06:31.448595 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:31.448568 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4z57l_44fed5ef-59c3-44b3-81d4-e05bb6339307/discovery/0.log"
Apr 17 22:06:31.661305 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:31.661272 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-79c9d94f8f-6r4gq_c0ac529e-2275-450c-89eb-219a1b2af810/kube-auth-proxy/0.log"
Apr 17 22:06:31.888743 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:31.888715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7646cfc968-5h87d_26507fb0-1d13-41d7-b18c-dfbaf034b8e0/router/0.log"
Apr 17 22:06:39.533701 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:39.533667 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rkkmx_f3915b71-644a-48b2-a22a-a629db33eec4/global-pull-secret-syncer/0.log"
Apr 17 22:06:39.595506 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:39.595473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bhgcd_8e6dcb6f-5562-4043-bca2-51c52997f079/konnectivity-agent/0.log"
Apr 17 22:06:39.681336 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:39.681304 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-47.ec2.internal_243afe9ca8e3caca522246c35745403e/haproxy/0.log"
Apr 17 22:06:43.454429 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:43.454336 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-rd6lc_cb73c7a3-25d3-4e4f-b519-fa6cba657b7a/limitador/0.log"
Apr 17 22:06:43.496901 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:43.496864 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-7lc7n_edc3476e-a908-46cd-b52f-a4e3dea16582/manager/0.log"
Apr 17 22:06:45.016754 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:45.016720 2576
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-vmb5v_538ad283-f697-4eb8-b901-99763f2b9340/cluster-monitoring-operator/0.log" Apr 17 22:06:45.262363 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:45.262333 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j5tkg_1f172aba-e062-4d68-a684-36ccee45d666/node-exporter/0.log" Apr 17 22:06:45.293209 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:45.293140 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j5tkg_1f172aba-e062-4d68-a684-36ccee45d666/kube-rbac-proxy/0.log" Apr 17 22:06:45.311832 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:45.311805 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j5tkg_1f172aba-e062-4d68-a684-36ccee45d666/init-textfile/0.log" Apr 17 22:06:47.847028 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:47.846999 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/1.log" Apr 17 22:06:47.853490 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:47.853468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-h57tx_152480c2-ecf4-4eab-a6b2-3f71ca86e6c0/console-operator/2.log" Apr 17 22:06:48.625787 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.625753 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf"] Apr 17 22:06:48.626129 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.626116 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" Apr 17 22:06:48.626179 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.626131 2576 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" Apr 17 22:06:48.626179 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.626140 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" Apr 17 22:06:48.626179 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.626145 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" Apr 17 22:06:48.626179 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.626153 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" Apr 17 22:06:48.626179 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.626158 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" Apr 17 22:06:48.626329 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.626215 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" Apr 17 22:06:48.626329 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.626225 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" Apr 17 22:06:48.626329 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.626231 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4d54a82-932a-44dd-be22-ac067be9f8d8" containerName="cleanup" Apr 17 22:06:48.629210 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.629193 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.631996 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.631941 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tfkqt\"/\"default-dockercfg-km98p\"" Apr 17 22:06:48.631996 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.631978 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tfkqt\"/\"kube-root-ca.crt\"" Apr 17 22:06:48.631996 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.631988 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tfkqt\"/\"openshift-service-ca.crt\"" Apr 17 22:06:48.636989 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.636966 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf"] Apr 17 22:06:48.794029 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.793990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-sys\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.794029 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.794028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-lib-modules\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.794285 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.794152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-proc\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.794285 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.794193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5z8v\" (UniqueName: \"kubernetes.io/projected/0517de97-8fca-4e8a-81ba-595673b86c39-kube-api-access-l5z8v\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.794285 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.794239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-podres\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.895237 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.895132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-sys\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.895237 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.895175 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-lib-modules\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " 
pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.895237 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.895236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-proc\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.895795 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.895233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-sys\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.895795 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.895265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5z8v\" (UniqueName: \"kubernetes.io/projected/0517de97-8fca-4e8a-81ba-595673b86c39-kube-api-access-l5z8v\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.895795 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.895346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-proc\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.895795 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.895363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-podres\") pod 
\"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.895795 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.895384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-lib-modules\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.895795 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.895652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0517de97-8fca-4e8a-81ba-595673b86c39-podres\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.896470 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.896454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-lqwtr_fa074a8e-430b-4109-87d9-1949bf0c1d86/volume-data-source-validator/0.log" Apr 17 22:06:48.902901 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.902878 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5z8v\" (UniqueName: \"kubernetes.io/projected/0517de97-8fca-4e8a-81ba-595673b86c39-kube-api-access-l5z8v\") pod \"perf-node-gather-daemonset-7hrqf\" (UID: \"0517de97-8fca-4e8a-81ba-595673b86c39\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:48.940151 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:48.940115 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:49.063969 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:49.063834 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf"] Apr 17 22:06:49.066857 ip-10-0-141-47 kubenswrapper[2576]: W0417 22:06:49.066825 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0517de97_8fca_4e8a_81ba_595673b86c39.slice/crio-775be5b33332020906552c1d9a77a1f75cb4c33826478ff933bde21feccb5140 WatchSource:0}: Error finding container 775be5b33332020906552c1d9a77a1f75cb4c33826478ff933bde21feccb5140: Status 404 returned error can't find the container with id 775be5b33332020906552c1d9a77a1f75cb4c33826478ff933bde21feccb5140 Apr 17 22:06:49.068505 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:49.068489 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 22:06:49.089592 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:49.089559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" event={"ID":"0517de97-8fca-4e8a-81ba-595673b86c39","Type":"ContainerStarted","Data":"775be5b33332020906552c1d9a77a1f75cb4c33826478ff933bde21feccb5140"} Apr 17 22:06:49.780137 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:49.780107 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lcgfs_da349b59-9fbe-4add-9ba3-8270d5731310/dns/0.log" Apr 17 22:06:49.798489 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:49.798456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lcgfs_da349b59-9fbe-4add-9ba3-8270d5731310/kube-rbac-proxy/0.log" Apr 17 22:06:49.860699 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:49.860670 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-gjnvl_3d6a476e-116f-4845-8146-b9bc2af9d504/dns-node-resolver/0.log" Apr 17 22:06:50.097837 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:50.097802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" event={"ID":"0517de97-8fca-4e8a-81ba-595673b86c39","Type":"ContainerStarted","Data":"6a8017fbee24a9b5505bf8c7caa75dbc81d54289a13615ce0a4d614dec261f3d"} Apr 17 22:06:50.098240 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:50.097894 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:06:50.113397 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:50.113349 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" podStartSLOduration=2.113335188 podStartE2EDuration="2.113335188s" podCreationTimestamp="2026-04-17 22:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 22:06:50.112286336 +0000 UTC m=+1823.946174265" watchObservedRunningTime="2026-04-17 22:06:50.113335188 +0000 UTC m=+1823.947223110" Apr 17 22:06:50.434188 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:50.434090 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4jhjv_02310a72-3db5-42e8-b257-0ccf87bb8deb/node-ca/0.log" Apr 17 22:06:51.367468 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:51.367440 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-4z57l_44fed5ef-59c3-44b3-81d4-e05bb6339307/discovery/0.log" Apr 17 22:06:51.407110 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:51.407062 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-79c9d94f8f-6r4gq_c0ac529e-2275-450c-89eb-219a1b2af810/kube-auth-proxy/0.log" Apr 17 22:06:51.483443 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:51.483416 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7646cfc968-5h87d_26507fb0-1d13-41d7-b18c-dfbaf034b8e0/router/0.log" Apr 17 22:06:51.976343 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:51.976298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8wmk5_6a2f302e-4951-4581-a1e6-f71a43573912/serve-healthcheck-canary/0.log" Apr 17 22:06:52.455587 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:52.455558 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-625wr_cd17cea4-4ab5-4023-b0a8-ebd4db6056d6/insights-operator/0.log" Apr 17 22:06:52.456505 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:52.456484 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-625wr_cd17cea4-4ab5-4023-b0a8-ebd4db6056d6/insights-operator/1.log" Apr 17 22:06:52.541174 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:52.541143 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-stw9p_4a70ef53-ac44-46a4-aa6d-728b21d6154e/kube-rbac-proxy/0.log" Apr 17 22:06:52.560807 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:52.560782 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-stw9p_4a70ef53-ac44-46a4-aa6d-728b21d6154e/exporter/0.log" Apr 17 22:06:52.582250 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:52.582210 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-stw9p_4a70ef53-ac44-46a4-aa6d-728b21d6154e/extractor/0.log" Apr 17 22:06:54.515009 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:54.514973 
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-xblx8_0cfa0f82-7fd4-413b-b0e8-1ad64764a7ec/manager/0.log" Apr 17 22:06:54.605148 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:54.605107 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-4tmcs_4a69880f-4007-485f-88af-595348fe4dab/manager/1.log" Apr 17 22:06:54.616514 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:54.616472 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-4tmcs_4a69880f-4007-485f-88af-595348fe4dab/manager/2.log" Apr 17 22:06:54.657965 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:54.657937 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bfddf7b9f-46h88_9b8ea2aa-bb66-4601-8120-6418a6c3d99b/manager/0.log" Apr 17 22:06:56.113261 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:06:56.113233 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-7hrqf" Apr 17 22:07:00.414400 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:00.414303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wl4bs_2f23fadd-370c-4137-9a4f-d3ee5b830a79/migrator/0.log" Apr 17 22:07:00.432110 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:00.432064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wl4bs_2f23fadd-370c-4137-9a4f-d3ee5b830a79/graceful-termination/0.log" Apr 17 22:07:00.771345 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:00.771235 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qwfp5_4ee90e9e-74f6-4a93-b52b-9e82c5789ae2/kube-storage-version-migrator-operator/1.log" Apr 17 22:07:00.772200 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:00.772180 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qwfp5_4ee90e9e-74f6-4a93-b52b-9e82c5789ae2/kube-storage-version-migrator-operator/0.log" Apr 17 22:07:02.136141 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.136104 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b5rqj_996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e/kube-multus-additional-cni-plugins/0.log" Apr 17 22:07:02.155631 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.155595 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b5rqj_996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e/egress-router-binary-copy/0.log" Apr 17 22:07:02.175217 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.175189 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b5rqj_996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e/cni-plugins/0.log" Apr 17 22:07:02.197590 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.197565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b5rqj_996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e/bond-cni-plugin/0.log" Apr 17 22:07:02.218101 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.218060 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b5rqj_996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e/routeoverride-cni/0.log" Apr 17 22:07:02.237788 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.237741 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b5rqj_996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e/whereabouts-cni-bincopy/0.log" Apr 17 22:07:02.256247 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.256219 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b5rqj_996e4d5d-ab1a-4fb7-99bd-6dd1589aac0e/whereabouts-cni/0.log" Apr 17 22:07:02.283342 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.283312 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kf8f6_e09621e2-cdbd-4f6e-8317-db02abbe345a/kube-multus/0.log" Apr 17 22:07:02.394461 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.394381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lncjj_697e918d-013d-41df-9440-059bd3d99a19/network-metrics-daemon/0.log" Apr 17 22:07:02.412448 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:02.412397 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lncjj_697e918d-013d-41df-9440-059bd3d99a19/kube-rbac-proxy/0.log" Apr 17 22:07:03.256496 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:03.256468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bhdzs_096937b2-0789-4eb0-a35d-1a44c37d72dd/ovn-controller/0.log" Apr 17 22:07:03.279926 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:03.279896 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bhdzs_096937b2-0789-4eb0-a35d-1a44c37d72dd/ovn-acl-logging/0.log" Apr 17 22:07:03.295370 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:03.295342 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bhdzs_096937b2-0789-4eb0-a35d-1a44c37d72dd/kube-rbac-proxy-node/0.log" Apr 17 22:07:03.315198 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:03.315170 2576 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bhdzs_096937b2-0789-4eb0-a35d-1a44c37d72dd/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 22:07:03.331644 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:03.331621 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bhdzs_096937b2-0789-4eb0-a35d-1a44c37d72dd/northd/0.log" Apr 17 22:07:03.356085 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:03.356048 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bhdzs_096937b2-0789-4eb0-a35d-1a44c37d72dd/nbdb/0.log" Apr 17 22:07:03.381116 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:03.381088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bhdzs_096937b2-0789-4eb0-a35d-1a44c37d72dd/sbdb/0.log" Apr 17 22:07:03.475000 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:03.474967 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bhdzs_096937b2-0789-4eb0-a35d-1a44c37d72dd/ovnkube-controller/0.log" Apr 17 22:07:05.082840 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:05.082793 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-kgrg6_05cac4af-b010-4bb6-ba12-8d55e0fedc36/check-endpoints/0.log" Apr 17 22:07:05.128341 ip-10-0-141-47 kubenswrapper[2576]: I0417 22:07:05.128304 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jlszn_9a699952-32f8-4727-bb72-f047e0297d2f/network-check-target-container/0.log"