Mar 18 16:43:49.287764 ip-10-0-141-231 systemd[1]: Starting Kubernetes Kubelet...
Mar 18 16:43:49.745781 ip-10-0-141-231 kubenswrapper[2563]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:49.745781 ip-10-0-141-231 kubenswrapper[2563]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 18 16:43:49.745781 ip-10-0-141-231 kubenswrapper[2563]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:49.745781 ip-10-0-141-231 kubenswrapper[2563]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 18 16:43:49.745781 ip-10-0-141-231 kubenswrapper[2563]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 18 16:43:49.748143 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.748062 2563 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 18 16:43:49.754422 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754402 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:49.754422 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754419 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:49.754422 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754423 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:49.754422 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754426 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:49.754422 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754429 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754432 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754435 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754437 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754440 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754443 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754446 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754449 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754451 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754454 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754456 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754459 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754461 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754464 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754467 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754469 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754472 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754474 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754477 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754480 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:49.754628 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754483 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754485 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754488 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754491 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754494 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754497 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754516 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754521 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754524 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754526 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754529 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754532 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754534 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754537 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754540 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754543 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754546 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754549 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754551 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754554 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:49.755102 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754557 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754559 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754562 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754564 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754567 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754569 2563 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754574 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754578 2563 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754581 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754584 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754587 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754590 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754593 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754596 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754598 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754601 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754603 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754606 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754609 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:49.755617 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754611 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754622 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754626 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754629 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754632 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754635 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754638 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754643 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754646 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754648 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754652 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754654 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754657 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754659 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754662 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754666 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754669 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754672 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754675 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:49.756073 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754677 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754680 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754683 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.754685 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755068 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755073 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755075 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755078 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755081 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755084 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755086 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755089 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755092 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755095 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755097 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755100 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755102 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755105 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755108 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755110 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:49.756530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755113 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755116 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755119 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755122 2563 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755125 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755128 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755130 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755133 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755136 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755139 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755142 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755144 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755147 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755150 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755152 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755154 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755157 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755160 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755162 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755165 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:49.757003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755168 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755171 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755174 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755176 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755179 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755181 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755184 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755186 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755190 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755192 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755195 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755198 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755200 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755203 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755205 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755208 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755210 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755213 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755216 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755218 2563 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:49.757530 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755221 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755224 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755226 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755229 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755231 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755234 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755237 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755240 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755242 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755245 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755248 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755250 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755253 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755255 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755257 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755260 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755262 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755265 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755267 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:49.758026 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755270 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755273 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755277 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755280 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755284 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755287 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755289 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755292 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755295 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755313 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.755317 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756485 2563 flags.go:64] FLAG: --address="0.0.0.0"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756494 2563 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756518 2563 flags.go:64] FLAG: --anonymous-auth="true"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756525 2563 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756531 2563 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756535 2563 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756540 2563 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756544 2563 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756547 2563 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756551 2563 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 18 16:43:49.758529 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756555 2563 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756558 2563 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756561 2563 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756564 2563 flags.go:64] FLAG: --cgroup-root=""
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756567 2563 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756570 2563 flags.go:64] FLAG: --client-ca-file=""
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756573 2563 flags.go:64] FLAG: --cloud-config=""
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756576 2563 flags.go:64] FLAG: --cloud-provider="external"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756579 2563 flags.go:64] FLAG: --cluster-dns="[]"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756583 2563 flags.go:64] FLAG: --cluster-domain=""
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756586 2563 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756589 2563 flags.go:64] FLAG: --config-dir=""
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756593 2563 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756596 2563 flags.go:64] FLAG: --container-log-max-files="5"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756605 2563 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756608 2563 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756612 2563 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756615 2563 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756618 2563 flags.go:64] FLAG: --contention-profiling="false"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756621 2563 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756624 2563 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756627 2563 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756630 2563 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756634 2563 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756638 2563 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 18 16:43:49.759040 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756640 2563 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756643 2563 flags.go:64] FLAG: --enable-load-reader="false"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756647 2563 flags.go:64] FLAG: --enable-server="true"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756650 2563 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756654 2563 flags.go:64] FLAG: --event-burst="100"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756657 2563 flags.go:64] FLAG: --event-qps="50"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756660 2563 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756663 2563 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756666 2563 flags.go:64] FLAG: --eviction-hard=""
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756670 2563 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756673 2563 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756676 2563 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756679 2563 flags.go:64] FLAG: --eviction-soft=""
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756682 2563 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756685 2563 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756688 2563 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756691 2563 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756694 2563 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756697 2563 flags.go:64] FLAG: --fail-swap-on="true"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756700 2563 flags.go:64] FLAG: --feature-gates=""
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756704 2563 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756707 2563 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756710 2563 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756714 2563 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756717 2563 flags.go:64] FLAG: --healthz-port="10248" Mar 18 16:43:49.759651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756720 2563 flags.go:64] FLAG: --help="false" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756723 2563 flags.go:64] FLAG: --hostname-override="ip-10-0-141-231.ec2.internal" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756726 2563 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756729 2563 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756733 2563 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756736 2563 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756740 2563 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756743 2563 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756746 2563 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756749 2563 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756752 2563 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756755 2563 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 16:43:49.760275 
ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756759 2563 flags.go:64] FLAG: --kube-api-qps="50" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756761 2563 flags.go:64] FLAG: --kube-reserved="" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756764 2563 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756767 2563 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756770 2563 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756773 2563 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756776 2563 flags.go:64] FLAG: --lock-file="" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756779 2563 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756782 2563 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756785 2563 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756794 2563 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 16:43:49.760275 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756797 2563 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756800 2563 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756803 2563 flags.go:64] FLAG: --logging-format="text" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756806 2563 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756810 2563 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756813 2563 flags.go:64] FLAG: --manifest-url="" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756815 2563 flags.go:64] FLAG: --manifest-url-header="" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756820 2563 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756823 2563 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756827 2563 flags.go:64] FLAG: --max-pods="110" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756830 2563 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756833 2563 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756836 2563 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756839 2563 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756842 2563 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756845 2563 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756848 2563 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756856 2563 flags.go:64] FLAG: 
--node-status-max-images="50" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756859 2563 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756862 2563 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756865 2563 flags.go:64] FLAG: --pod-cidr="" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756868 2563 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756874 2563 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756877 2563 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 16:43:49.760858 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756880 2563 flags.go:64] FLAG: --pods-per-core="0" Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756883 2563 flags.go:64] FLAG: --port="10250" Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756886 2563 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756889 2563 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e7498c1984ad25c4" Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756892 2563 flags.go:64] FLAG: --qos-reserved="" Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756895 2563 flags.go:64] FLAG: --read-only-port="10255" Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756898 2563 flags.go:64] FLAG: --register-node="true" Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756901 2563 flags.go:64] FLAG: --register-schedulable="true" 
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756905 2563 flags.go:64] FLAG: --register-with-taints=""
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756912 2563 flags.go:64] FLAG: --registry-burst="10"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756915 2563 flags.go:64] FLAG: --registry-qps="5"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756917 2563 flags.go:64] FLAG: --reserved-cpus=""
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756920 2563 flags.go:64] FLAG: --reserved-memory=""
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756924 2563 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756927 2563 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756930 2563 flags.go:64] FLAG: --rotate-certificates="false"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756933 2563 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756936 2563 flags.go:64] FLAG: --runonce="false"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756938 2563 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756942 2563 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756944 2563 flags.go:64] FLAG: --seccomp-default="false"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756947 2563 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756950 2563 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756953 2563 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756956 2563 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756959 2563 flags.go:64] FLAG: --storage-driver-password="root"
Mar 18 16:43:49.761438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756962 2563 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756964 2563 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756967 2563 flags.go:64] FLAG: --storage-driver-user="root"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756971 2563 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756974 2563 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756977 2563 flags.go:64] FLAG: --system-cgroups=""
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756979 2563 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756985 2563 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756988 2563 flags.go:64] FLAG: --tls-cert-file=""
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756991 2563 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756995 2563 flags.go:64] FLAG: --tls-min-version=""
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.756998 2563 flags.go:64] FLAG: --tls-private-key-file=""
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.757000 2563 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.757003 2563 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.757008 2563 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.757011 2563 flags.go:64] FLAG: --v="2"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.757015 2563 flags.go:64] FLAG: --version="false"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.757019 2563 flags.go:64] FLAG: --vmodule=""
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.757023 2563 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.757026 2563 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757135 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757139 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757142 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757145 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:49.762080 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757148 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757151 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757154 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757157 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757160 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757165 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757167 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757170 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757172 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757175 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757178 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757180 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757183 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757186 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757188 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757191 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757194 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757196 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757199 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:49.762656 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757201 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757204 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757206 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757210 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757213 2563 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757217 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757221 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757224 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757227 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757230 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757233 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757235 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757238 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757241 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757244 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757246 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757249 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757251 2563 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757255 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:49.763135 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757258 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757260 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757263 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757266 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757268 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757271 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757273 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757276 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757278 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757281 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757284 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757286 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757289 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757291 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757294 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757296 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757300 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757302 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757305 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757307 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:49.763655 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757310 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757312 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757315 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757318 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757320 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757323 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757325 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757328 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757331 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757333 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757336 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757339 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757342 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757344 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757347 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757349 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757352 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757354 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757357 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757359 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:49.764187 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757362 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:49.764733 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757366 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:49.764733 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757369 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:49.764733 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.757372 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:49.764733 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.759406 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:49.765741 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.765724 2563 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 18 16:43:49.765783 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.765742 2563 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765791 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765796 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765800 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765803 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765806 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765808 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765811 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765814 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765817 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:49.765814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765819 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765824 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765829 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765832 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765835 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765838 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765841 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765845 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765848 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765851 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765853 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765856 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765859 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765861 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765864 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765866 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765870 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765873 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765876 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765879 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:49.766062 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765881 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765884 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765886 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765889 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765892 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765894 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765896 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765899 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765901 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765904 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765906 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765908 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765911 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765914 2563 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765917 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765920 2563 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765923 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765925 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765928 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765940 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:49.766574 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765943 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765946 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765949 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765952 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765955 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765957 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765960 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765962 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765965 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765969 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765971 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765974 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765976 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765979 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765981 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765984 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765986 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765989 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765991 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:49.767055 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765994 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765996 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.765998 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766001 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766004 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766006 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766010 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766015 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766018 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766021 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766024 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766026 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766029 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766031 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766034 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766036 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766039 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:49.767535 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766041 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.766046 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766137 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766141 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766144 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766147 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766151 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766154 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766157 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766160 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766163 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766165 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766168 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766171 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766173 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 18 16:43:49.767947 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766176 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766178 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766181 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766184 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766187 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766189 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766192 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766195 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766197 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766200 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766203 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766206 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766209 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766211 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766214 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766216 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766219 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766221 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766224 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 18 16:43:49.768310 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766226 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766229 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766231 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766234 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766236 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766239 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766242 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766245 2563 feature_gate.go:328] unrecognized feature gate: Example
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766247 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766250 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766252 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766255 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766258 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766261 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766264 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766266 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766269 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766272 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766274 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766277 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 18 16:43:49.768786 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766280 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766283 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766286 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766288 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766291 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766293 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766296 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766298 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766301 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766303 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766306 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766308 2563 feature_gate.go:328] unrecognized feature gate: Example2
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766311 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766313 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766315 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766318 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766320 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766323 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766326 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766329 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 18 16:43:49.769267 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766332 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766334 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766338 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766341 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766344 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766347 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766350 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766353 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766356 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766359 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766361 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766364 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766367 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:49.766369 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.766374 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 18 16:43:49.769814 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.767076 2563 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 18 16:43:49.770188 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.769088 2563 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 18 16:43:49.770255 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.770243 2563 server.go:1019] "Starting client certificate rotation"
Mar 18 16:43:49.770358 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.770341 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:43:49.770395 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.770385 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 18 16:43:49.796544 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.796524 2563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:43:49.800767 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.800749 2563 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 18 16:43:49.820586 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.820569 2563 log.go:25] "Validated CRI v1 runtime API"
Mar 18 16:43:49.825322 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.825305 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 18 16:43:49.826472 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.826459 2563 log.go:25] "Validated CRI v1 image API"
Mar 18 16:43:49.828633 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.828618 2563 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 18 16:43:49.832425 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.832408 2563 fs.go:135] Filesystem UUIDs: map[0dd0ac51-5b9a-4c79-9b1e-13e74d1f9ea4:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 93588640-6bfc-4ddf-8f87-358c7ec2a41a:/dev/nvme0n1p3]
Mar 18 16:43:49.832479 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.832425 2563 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 18 16:43:49.838565 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.838449 2563 manager.go:217] Machine: {Timestamp:2026-03-18 16:43:49.836108137 +0000 UTC m=+0.428048699 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097933 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2237d4c965e2d439362cbac7dd1cb0 SystemUUID:ec2237d4-c965-e2d4-3936-2cbac7dd1cb0 BootID:2508b743-0454-4f09-a9b5-cf87a8ed19b7 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0d:48:26:26:99 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0d:48:26:26:99 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:70:0d:fd:cc:eb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 18 16:43:49.839054 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.839044 2563 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 18 16:43:49.839133 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.839122 2563 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 18 16:43:49.841477 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.841452 2563 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 18 16:43:49.841630 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.841480 2563 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-231.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 18 16:43:49.841673 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.841640 2563 topology_manager.go:138] "Creating topology manager with none policy"
Mar 18 16:43:49.841673 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.841649 2563 container_manager_linux.go:306] "Creating device plugin manager"
Mar 18 16:43:49.841673 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.841666 2563 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:43:49.842494 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.842483 2563 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 18 16:43:49.843973 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.843963 2563 state_mem.go:36] "Initialized new in-memory state store"
Mar 18 16:43:49.844072 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.844063 2563 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Mar 18 16:43:49.846367 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.846348 2563 kubelet.go:491] "Attempting to sync node with API server"
Mar 18 16:43:49.846367 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.846372 2563 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 18 16:43:49.846437 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.846383 2563 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 18 16:43:49.846437 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.846392 2563 kubelet.go:397] "Adding apiserver pod source"
Mar 18 16:43:49.846437 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.846400 2563 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 18 16:43:49.847485 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.847471 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:43:49.847584 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.847573 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Mar 18 16:43:49.850709 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.850694 2563 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1"
Mar 18 16:43:49.852199 ip-10-0-141-231
kubenswrapper[2563]: I0318 16:43:49.852186 2563 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 18 16:43:49.854747 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854735 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 16:43:49.854789 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854756 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 16:43:49.854789 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854765 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 16:43:49.854789 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854770 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 16:43:49.854789 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854776 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 16:43:49.854789 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854781 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 16:43:49.854789 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854788 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 16:43:49.854942 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854793 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 16:43:49.854942 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854800 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 16:43:49.854942 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854805 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 16:43:49.854942 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854818 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 
16:43:49.854942 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.854828 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 16:43:49.855625 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.855615 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 18 16:43:49.855660 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.855627 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Mar 18 16:43:49.859129 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.859115 2563 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 18 16:43:49.859188 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.859149 2563 server.go:1295] "Started kubelet" Mar 18 16:43:49.859281 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.859243 2563 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 16:43:49.859338 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.859312 2563 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 18 16:43:49.859381 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.859319 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 18 16:43:49.859740 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.859724 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 18 16:43:49.859782 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.859264 2563 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Mar 18 16:43:49.859826 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.859815 2563 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-231.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 16:43:49.860151 ip-10-0-141-231 systemd[1]: Started Kubernetes Kubelet. Mar 18 16:43:49.861221 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.861144 2563 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 16:43:49.861907 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.861887 2563 server.go:317] "Adding debug handlers to kubelet server" Mar 18 16:43:49.867963 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.867032 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-231.ec2.internal.189dfd346cc9af1f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-231.ec2.internal,UID:ip-10-0-141-231.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-231.ec2.internal,},FirstTimestamp:2026-03-18 16:43:49.859127071 +0000 UTC m=+0.451067613,LastTimestamp:2026-03-18 16:43:49.859127071 +0000 UTC m=+0.451067613,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-231.ec2.internal,}" Mar 18 16:43:49.868252 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.868233 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Mar 18 16:43:49.869615 
ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.869597 2563 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 16:43:49.870250 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.870233 2563 volume_manager.go:295] "The desired_state_of_world populator starts" Mar 18 16:43:49.870250 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.870253 2563 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 18 16:43:49.870383 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.870235 2563 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Mar 18 16:43:49.870383 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.870344 2563 reconstruct.go:97] "Volume reconstruction finished" Mar 18 16:43:49.870383 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.870353 2563 reconciler.go:26] "Reconciler: start to sync state" Mar 18 16:43:49.870533 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.870393 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:49.871944 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.871682 2563 factory.go:55] Registering systemd factory Mar 18 16:43:49.871944 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.871741 2563 factory.go:223] Registration of the systemd container factory successfully Mar 18 16:43:49.872210 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.872196 2563 factory.go:153] Registering CRI-O factory Mar 18 16:43:49.872285 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.872213 2563 factory.go:223] Registration of the crio container factory successfully Mar 18 16:43:49.872285 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.872275 2563 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 
16:43:49.872384 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.872294 2563 factory.go:103] Registering Raw factory Mar 18 16:43:49.872384 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.872308 2563 manager.go:1196] Started watching for new ooms in manager Mar 18 16:43:49.872740 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.872728 2563 manager.go:319] Starting recovery of all containers Mar 18 16:43:49.874167 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.873994 2563 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 18 16:43:49.881342 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.881188 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Mar 18 16:43:49.881342 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.881298 2563 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-231.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Mar 18 16:43:49.882087 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.882068 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zj6pw" Mar 18 16:43:49.882698 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.882677 2563 manager.go:324] Recovery completed Mar 18 16:43:49.886716 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.886703 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 
18 16:43:49.889179 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.889165 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:49.889236 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.889193 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:49.889236 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.889203 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:49.889927 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.889904 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zj6pw" Mar 18 16:43:49.890027 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.890008 2563 cpu_manager.go:222] "Starting CPU manager" policy="none" Mar 18 16:43:49.890027 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.890021 2563 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Mar 18 16:43:49.890119 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.890039 2563 state_mem.go:36] "Initialized new in-memory state store" Mar 18 16:43:49.892367 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.892350 2563 policy_none.go:49] "None policy: Start" Mar 18 16:43:49.892428 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.892375 2563 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 18 16:43:49.892428 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.892390 2563 state_mem.go:35] "Initializing new in-memory state store" Mar 18 16:43:49.929130 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.929109 2563 manager.go:341] "Starting Device Plugin manager" Mar 18 16:43:49.929130 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.929137 2563 manager.go:517] "Failed to read data from checkpoint" err="checkpoint 
is not found" checkpoint="kubelet_internal_checkpoint" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.929146 2563 server.go:85] "Starting device plugin registration server" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.929389 2563 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.929400 2563 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.929472 2563 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.929675 2563 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.929685 2563 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.930118 2563 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.930164 2563 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.934945 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.936054 2563 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.936073 2563 status_manager.go:230] "Starting to sync pod status with apiserver" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.936088 2563 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.936095 2563 kubelet.go:2451] "Starting kubelet main sync loop" Mar 18 16:43:49.938775 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:49.936126 2563 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Mar 18 16:43:49.939536 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:49.939523 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:50.030406 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.030326 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.031238 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.031223 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.031329 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.031256 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.031329 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.031270 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.031329 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.031300 2563 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.036370 ip-10-0-141-231 kubenswrapper[2563]: I0318 
16:43:50.036353 2563 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal"] Mar 18 16:43:50.036438 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.036409 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.037751 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.037732 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.037840 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.037757 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.037840 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.037768 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.038948 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.038936 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.039100 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.039086 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.039163 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.039121 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.039542 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.039526 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.039621 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.039552 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.039621 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.039566 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.039621 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.039599 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.039621 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.039616 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.039767 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.039630 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.040712 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.040699 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.040772 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.040723 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Mar 18 16:43:50.041305 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.041292 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientMemory" Mar 18 16:43:50.041369 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.041317 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasNoDiskPressure" Mar 18 16:43:50.041369 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.041330 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeHasSufficientPID" Mar 18 16:43:50.041469 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.041455 2563 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.041515 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.041476 2563 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-231.ec2.internal\": node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.056019 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.056002 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.058999 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.058985 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-231.ec2.internal\" not found" node="ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.062728 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.062712 2563 kubelet.go:3336] "No need to create a mirror pod, 
since failed to get node info from the cluster" err="node \"ip-10-0-141-231.ec2.internal\" not found" node="ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.071546 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.071528 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/745299a5e6bc40770486c0ea9932489e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal\" (UID: \"745299a5e6bc40770486c0ea9932489e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.071613 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.071551 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/745299a5e6bc40770486c0ea9932489e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal\" (UID: \"745299a5e6bc40770486c0ea9932489e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.071613 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.071568 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c21613cafe918f7212fff6fba314410-config\") pod \"kube-apiserver-proxy-ip-10-0-141-231.ec2.internal\" (UID: \"4c21613cafe918f7212fff6fba314410\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.156336 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.156314 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.172673 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.172656 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/745299a5e6bc40770486c0ea9932489e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal\" (UID: \"745299a5e6bc40770486c0ea9932489e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.172738 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.172680 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/745299a5e6bc40770486c0ea9932489e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal\" (UID: \"745299a5e6bc40770486c0ea9932489e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.172738 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.172696 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c21613cafe918f7212fff6fba314410-config\") pod \"kube-apiserver-proxy-ip-10-0-141-231.ec2.internal\" (UID: \"4c21613cafe918f7212fff6fba314410\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.172812 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.172737 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c21613cafe918f7212fff6fba314410-config\") pod \"kube-apiserver-proxy-ip-10-0-141-231.ec2.internal\" (UID: \"4c21613cafe918f7212fff6fba314410\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.172812 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.172753 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/745299a5e6bc40770486c0ea9932489e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal\" (UID: \"745299a5e6bc40770486c0ea9932489e\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.172812 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.172764 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/745299a5e6bc40770486c0ea9932489e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal\" (UID: \"745299a5e6bc40770486c0ea9932489e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.256812 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.256790 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.357651 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.357596 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.361754 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.361741 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.365292 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.365272 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal" Mar 18 16:43:50.458007 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.457979 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.558544 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.558517 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.658986 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.658931 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.759489 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.759465 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.769632 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.769611 2563 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 16:43:50.769780 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.769763 2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Mar 18 16:43:50.860426 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.860404 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:50.868532 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.868474 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Mar 18 16:43:50.876694 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.876673 2563 
reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Mar 18 16:43:50.891837 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.891798 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-17 16:38:49 +0000 UTC" deadline="2027-12-13 06:30:46.11208534 +0000 UTC" Mar 18 16:43:50.891837 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.891835 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15229h46m55.22025358s" Mar 18 16:43:50.894217 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.894195 2563 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:50.914968 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.914926 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fmqh5" Mar 18 16:43:50.924718 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.924670 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fmqh5" Mar 18 16:43:50.943713 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:50.943685 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c21613cafe918f7212fff6fba314410.slice/crio-f33407da5a1b7241c9e67d95dd81f02fc6254cb738ae7d4d88dfe35250589308 WatchSource:0}: Error finding container f33407da5a1b7241c9e67d95dd81f02fc6254cb738ae7d4d88dfe35250589308: Status 404 returned error can't find the container with id f33407da5a1b7241c9e67d95dd81f02fc6254cb738ae7d4d88dfe35250589308 Mar 18 16:43:50.948253 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:50.948239 2563 provider.go:93] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:43:50.961446 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:50.961429 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:51.007901 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:51.007879 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod745299a5e6bc40770486c0ea9932489e.slice/crio-fbee49530b1fb216ae36c9d8c97ea26a2341e572b3afe08266d0de2ce2fb04ad WatchSource:0}: Error finding container fbee49530b1fb216ae36c9d8c97ea26a2341e572b3afe08266d0de2ce2fb04ad: Status 404 returned error can't find the container with id fbee49530b1fb216ae36c9d8c97ea26a2341e572b3afe08266d0de2ce2fb04ad Mar 18 16:43:51.062161 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:51.062140 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:51.137536 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.137516 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:51.163016 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:51.163000 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:51.263624 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:51.263603 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:51.364357 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:51.364332 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-231.ec2.internal\" not found" Mar 18 16:43:51.421077 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.420986 2563 reflector.go:430] 
"Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Mar 18 16:43:51.470029 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.469971 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" Mar 18 16:43:51.484582 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.484557 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:43:51.485539 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.485523 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal" Mar 18 16:43:51.493462 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.493444 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Mar 18 16:43:51.847911 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.847850 2563 apiserver.go:52] "Watching apiserver" Mar 18 16:43:51.853335 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.853314 2563 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 18 16:43:51.854519 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.854483 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-dsml5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6","openshift-image-registry/node-ca-f9chd","openshift-multus/network-metrics-daemon-kbxwz","openshift-network-diagnostics/network-check-target-s8rrk","openshift-network-operator/iptables-alerter-nztqh","kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal","openshift-cluster-node-tuning-operator/tuned-lxcdc","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal","openshift-multus/multus-additional-cni-plugins-jxfqg","openshift-multus/multus-glstl","openshift-ovn-kubernetes/ovnkube-node-9gjw2"] Mar 18 16:43:51.856196 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.856178 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dsml5" Mar 18 16:43:51.858301 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.858204 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-974rr\"" Mar 18 16:43:51.858301 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.858213 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 18 16:43:51.858301 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.858228 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 18 16:43:51.858725 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.858701 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.858807 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.858772 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-f9chd" Mar 18 16:43:51.860392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.860207 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:43:51.860392 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:51.860284 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3" Mar 18 16:43:51.860392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.860357 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 18 16:43:51.860605 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.860428 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sf4nn\"" Mar 18 16:43:51.860675 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.860656 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 18 16:43:51.860786 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.860768 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 18 16:43:51.860841 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.860787 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mh6jl\"" Mar 18 16:43:51.860890 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.860842 2563 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 18 16:43:51.861036 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.860939 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 18 16:43:51.861156 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.861142 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 18 16:43:51.861436 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.861423 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:43:51.861539 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:51.861485 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049" Mar 18 16:43:51.862864 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.862843 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nztqh" Mar 18 16:43:51.864087 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.864069 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.867596 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.865226 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:51.867596 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.865298 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 18 16:43:51.867596 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.865449 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:51.867596 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.865719 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kg6fd\"" Mar 18 16:43:51.867596 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.866751 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-47rlt\"" Mar 18 16:43:51.867596 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.867018 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 18 16:43:51.867596 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.867045 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:43:51.867596 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.867452 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" Mar 18 16:43:51.869069 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.869045 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-glstl" Mar 18 16:43:51.869366 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.869350 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 18 16:43:51.869777 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.869759 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 18 16:43:51.869777 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.869771 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 18 16:43:51.869948 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.869777 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 18 16:43:51.870794 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.870493 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-rrr6x\"" Mar 18 16:43:51.870794 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.870520 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.870794 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.870599 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 18 16:43:51.871272 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.871220 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-nnrtd\"" Mar 18 16:43:51.871272 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.871269 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 18 16:43:51.871526 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.871495 2563 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 18 16:43:51.872700 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.872686 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 18 16:43:51.873358 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.873339 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 18 16:43:51.873358 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.873351 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 18 16:43:51.873710 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.873692 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 18 16:43:51.873791 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.873699 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rq8lg\"" Mar 18 16:43:51.874030 
ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.874016 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 18 16:43:51.874755 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.874738 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 18 16:43:51.882823 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.882807 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/41131912-91a1-43ad-a23a-203bd6091794-etc-tuned\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.882917 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.882862 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxcm6\" (UniqueName: \"kubernetes.io/projected/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-kube-api-access-vxcm6\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.882917 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.882887 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-registration-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.882917 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.882913 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdc5r\" (UniqueName: \"kubernetes.io/projected/2e2ef5c8-0952-4eef-a0d9-19656f20a5a5-kube-api-access-kdc5r\") 
pod \"iptables-alerter-nztqh\" (UID: \"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5\") " pod="openshift-network-operator/iptables-alerter-nztqh" Mar 18 16:43:51.883045 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.882935 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-os-release\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg" Mar 18 16:43:51.883045 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.882975 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftg87\" (UniqueName: \"kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87\") pod \"network-check-target-s8rrk\" (UID: \"fde7e0f5-379e-4950-a766-5b94afe18049\") " pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:43:51.883045 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883009 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-var-lib-kubelet\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.883045 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883027 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-cni-dir\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.883231 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883054 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-daemon-config\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.883231 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883085 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-systemd-units\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.883231 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883117 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2da91ba-0645-46a4-a59d-b7219ba40de9-ovnkube-config\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.883231 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883137 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.883231 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883157 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-socket-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.883231 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883175 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-sys-fs\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.883231 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883192 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-sysconfig\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.883511 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883241 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh5kx\" (UniqueName: \"kubernetes.io/projected/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-kube-api-access-bh5kx\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.883511 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883295 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zxnd\" (UniqueName: \"kubernetes.io/projected/4aac1f96-b5de-49d5-84a3-176806d103dc-kube-api-access-6zxnd\") pod \"node-ca-f9chd\" (UID: \"4aac1f96-b5de-49d5-84a3-176806d103dc\") " pod="openshift-image-registry/node-ca-f9chd" Mar 18 16:43:51.883511 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883332 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-host\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.883511 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883362 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-system-cni-dir\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.883511 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883385 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-cni-netd\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.883511 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883415 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-var-lib-cni-multus\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.883511 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883452 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-kubelet\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.883511 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883477 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-run-systemd\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883516 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-log-socket\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883556 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-kubernetes\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883584 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhq76\" (UniqueName: \"kubernetes.io/projected/41131912-91a1-43ad-a23a-203bd6091794-kube-api-access-dhq76\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883627 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-cni-binary-copy\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " 
pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883652 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-cnibin\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883677 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aac1f96-b5de-49d5-84a3-176806d103dc-host\") pod \"node-ca-f9chd\" (UID: \"4aac1f96-b5de-49d5-84a3-176806d103dc\") " pod="openshift-image-registry/node-ca-f9chd"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883701 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e2ef5c8-0952-4eef-a0d9-19656f20a5a5-iptables-alerter-script\") pod \"iptables-alerter-nztqh\" (UID: \"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5\") " pod="openshift-network-operator/iptables-alerter-nztqh"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883725 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-run-multus-certs\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883755 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-slash\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883782 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-run-ovn\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883808 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-sys\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883833 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-run-netns\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883855 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-run-netns\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883879 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-var-lib-openvswitch\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883901 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-run-openvswitch\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.883941 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883925 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-sysctl-conf\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883949 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-var-lib-kubelet\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.883976 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-conf-dir\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884033 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-etc-openvswitch\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884057 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7ef73172-0617-4d44-b9f4-f5d3832924d2-agent-certs\") pod \"konnectivity-agent-dsml5\" (UID: \"7ef73172-0617-4d44-b9f4-f5d3832924d2\") " pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884082 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-device-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884113 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e2ef5c8-0952-4eef-a0d9-19656f20a5a5-host-slash\") pod \"iptables-alerter-nztqh\" (UID: \"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5\") " pod="openshift-network-operator/iptables-alerter-nztqh"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884138 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-cnibin\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884169 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-node-log\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884208 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9s8k\" (UniqueName: \"kubernetes.io/projected/c2da91ba-0645-46a4-a59d-b7219ba40de9-kube-api-access-q9s8k\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884305 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4aac1f96-b5de-49d5-84a3-176806d103dc-serviceca\") pod \"node-ca-f9chd\" (UID: \"4aac1f96-b5de-49d5-84a3-176806d103dc\") " pod="openshift-image-registry/node-ca-f9chd"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884333 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7ef73172-0617-4d44-b9f4-f5d3832924d2-konnectivity-ca\") pod \"konnectivity-agent-dsml5\" (UID: \"7ef73172-0617-4d44-b9f4-f5d3832924d2\") " pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884352 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-etc-selinux\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884368 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884385 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-os-release\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884406 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-cni-binary-copy\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.884619 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884420 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-run-k8s-cni-cncf-io\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884436 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-var-lib-cni-bin\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884457 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-etc-kubernetes\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884495 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2da91ba-0645-46a4-a59d-b7219ba40de9-env-overrides\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884542 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884573 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-run\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884612 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-socket-dir-parent\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884642 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-hostroot\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884682 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-run-ovn-kubernetes\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884706 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2da91ba-0645-46a4-a59d-b7219ba40de9-ovn-node-metrics-cert\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884739 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2da91ba-0645-46a4-a59d-b7219ba40de9-ovnkube-script-lib\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884760 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884791 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-lib-modules\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884807 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41131912-91a1-43ad-a23a-203bd6091794-tmp\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884827 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf9t6\" (UniqueName: \"kubernetes.io/projected/c0b2d8c5-09a4-472a-aad2-fa033be042f3-kube-api-access-pf9t6\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884849 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-sysctl-d\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.885227 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884871 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-systemd\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.885850 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884892 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-cni-bin\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.885850 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884917 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.885850 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.884960 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:43:51.885850 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.885007 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-system-cni-dir\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.885850 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.885034 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x88f\" (UniqueName: \"kubernetes.io/projected/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-kube-api-access-9x88f\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.885850 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.885058 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-modprobe-d\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.925908 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.925882 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:38:50 +0000 UTC" deadline="2027-12-05 11:26:29.245135056 +0000 UTC"
Mar 18 16:43:51.925908 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.925904 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15042h42m37.31923484s"
Mar 18 16:43:51.941743 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.941697 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal" event={"ID":"4c21613cafe918f7212fff6fba314410","Type":"ContainerStarted","Data":"f33407da5a1b7241c9e67d95dd81f02fc6254cb738ae7d4d88dfe35250589308"}
Mar 18 16:43:51.942737 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.942714 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" event={"ID":"745299a5e6bc40770486c0ea9932489e","Type":"ContainerStarted","Data":"fbee49530b1fb216ae36c9d8c97ea26a2341e572b3afe08266d0de2ce2fb04ad"}
Mar 18 16:43:51.985886 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.985866 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-cni-binary-copy\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986004 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.985896 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-run-k8s-cni-cncf-io\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986004 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.985912 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-var-lib-cni-bin\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986004 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.985941 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-etc-kubernetes\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986004 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.985989 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-etc-kubernetes\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986004 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.985994 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-run-k8s-cni-cncf-io\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986011 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-var-lib-cni-bin\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986027 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2da91ba-0645-46a4-a59d-b7219ba40de9-env-overrides\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986063 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986088 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-run\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986112 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-socket-dir-parent\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986128 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-hostroot\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986146 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-run-ovn-kubernetes\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986171 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2da91ba-0645-46a4-a59d-b7219ba40de9-ovn-node-metrics-cert\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986195 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2da91ba-0645-46a4-a59d-b7219ba40de9-ovnkube-script-lib\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986206 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-socket-dir-parent\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986208 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-hostroot\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986219 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986245 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-lib-modules\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.986254 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986254 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-run\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986379 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-lib-modules\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986384 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986448 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-run-ovn-kubernetes\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986488 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41131912-91a1-43ad-a23a-203bd6091794-tmp\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986537 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pf9t6\" (UniqueName: \"kubernetes.io/projected/c0b2d8c5-09a4-472a-aad2-fa033be042f3-kube-api-access-pf9t6\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986565 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-sysctl-d\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986611 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-systemd\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986634 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-cni-bin\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986659 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986682 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-systemd\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986683 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986568 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-cni-binary-copy\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986729 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-sysctl-d\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986737 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-cni-bin\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986582 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986577 2563 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:51.986779 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:51.986904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986785 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986830 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-system-cni-dir\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:51.986863 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs podName:c0b2d8c5-09a4-472a-aad2-fa033be042f3 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:52.486825489 +0000 UTC m=+3.078766025 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs") pod "network-metrics-daemon-kbxwz" (UID: "c0b2d8c5-09a4-472a-aad2-fa033be042f3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986876 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-system-cni-dir\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986893 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x88f\" (UniqueName: \"kubernetes.io/projected/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-kube-api-access-9x88f\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986921 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-modprobe-d\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986945 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/41131912-91a1-43ad-a23a-203bd6091794-etc-tuned\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.986991 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxcm6\" (UniqueName: \"kubernetes.io/projected/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-kube-api-access-vxcm6\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987014 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-registration-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6"
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987032 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdc5r\" (UniqueName: \"kubernetes.io/projected/2e2ef5c8-0952-4eef-a0d9-19656f20a5a5-kube-api-access-kdc5r\") pod \"iptables-alerter-nztqh\" (UID: \"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5\") " pod="openshift-network-operator/iptables-alerter-nztqh"
Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987056 2563 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-os-release\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg" Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987318 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-registration-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987350 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2da91ba-0645-46a4-a59d-b7219ba40de9-env-overrides\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987408 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2da91ba-0645-46a4-a59d-b7219ba40de9-ovnkube-script-lib\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987434 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-modprobe-d\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 
16:43:51.987441 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftg87\" (UniqueName: \"kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87\") pod \"network-check-target-s8rrk\" (UID: \"fde7e0f5-379e-4950-a766-5b94afe18049\") " pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:43:51.987683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987472 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-var-lib-kubelet\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987531 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-cni-dir\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987557 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-daemon-config\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987564 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-var-lib-kubelet\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.988392 ip-10-0-141-231 
kubenswrapper[2563]: I0318 16:43:51.987580 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-systemd-units\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987477 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-os-release\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987624 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-systemd-units\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987653 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2da91ba-0645-46a4-a59d-b7219ba40de9-ovnkube-config\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987680 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 
18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987692 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-cni-dir\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987704 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-socket-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987748 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987749 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-sys-fs\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987783 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-sysconfig\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987792 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-sys-fs\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987810 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh5kx\" (UniqueName: \"kubernetes.io/projected/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-kube-api-access-bh5kx\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987840 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zxnd\" (UniqueName: \"kubernetes.io/projected/4aac1f96-b5de-49d5-84a3-176806d103dc-kube-api-access-6zxnd\") pod \"node-ca-f9chd\" (UID: \"4aac1f96-b5de-49d5-84a3-176806d103dc\") " pod="openshift-image-registry/node-ca-f9chd" Mar 18 16:43:51.988392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987884 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-host\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987899 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-socket-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" 
(UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987908 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-system-cni-dir\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987933 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-cni-netd\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987967 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-var-lib-cni-multus\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.987992 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-kubelet\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988036 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-cni-netd\") pod \"ovnkube-node-9gjw2\" (UID: 
\"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988036 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-var-lib-cni-multus\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988051 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-sysconfig\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988074 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-system-cni-dir\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988103 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-run-systemd\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988119 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-kubelet\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988124 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2da91ba-0645-46a4-a59d-b7219ba40de9-ovnkube-config\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988132 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-log-socket\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988160 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-kubernetes\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988181 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-run-systemd\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988187 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhq76\" (UniqueName: \"kubernetes.io/projected/41131912-91a1-43ad-a23a-203bd6091794-kube-api-access-dhq76\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988161 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-host\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.989124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988209 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-log-socket\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988215 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-cni-binary-copy\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988253 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-cnibin\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988275 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-kubernetes\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988277 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aac1f96-b5de-49d5-84a3-176806d103dc-host\") pod \"node-ca-f9chd\" (UID: \"4aac1f96-b5de-49d5-84a3-176806d103dc\") " pod="openshift-image-registry/node-ca-f9chd" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988307 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4aac1f96-b5de-49d5-84a3-176806d103dc-host\") pod \"node-ca-f9chd\" (UID: \"4aac1f96-b5de-49d5-84a3-176806d103dc\") " pod="openshift-image-registry/node-ca-f9chd" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988327 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-cnibin\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988332 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e2ef5c8-0952-4eef-a0d9-19656f20a5a5-iptables-alerter-script\") pod \"iptables-alerter-nztqh\" (UID: \"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5\") " pod="openshift-network-operator/iptables-alerter-nztqh" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988368 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-run-multus-certs\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 
16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988398 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-slash\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988421 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-run-ovn\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988461 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-slash\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988460 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-run-multus-certs\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988476 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-sys\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 
16:43:51.988515 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-run-ovn\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988526 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-run-netns\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988534 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-sys\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988554 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-run-netns\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.989964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988598 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-var-lib-openvswitch\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988629 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-run-openvswitch\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988633 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-host-run-netns\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988640 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-cni-binary-copy\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg" Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988648 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-var-lib-openvswitch\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988653 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-sysctl-conf\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 
16:43:51.988601 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-run-netns\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988686 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-var-lib-kubelet\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988709 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-conf-dir\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988718 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-host-var-lib-kubelet\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988731 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-etc-openvswitch\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988686 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-run-openvswitch\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988766 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-conf-dir\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988772 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/41131912-91a1-43ad-a23a-203bd6091794-etc-sysctl-conf\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988787 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-etc-openvswitch\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988794 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7ef73172-0617-4d44-b9f4-f5d3832924d2-agent-certs\") pod \"konnectivity-agent-dsml5\" (UID: \"7ef73172-0617-4d44-b9f4-f5d3832924d2\") " pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988804 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2e2ef5c8-0952-4eef-a0d9-19656f20a5a5-iptables-alerter-script\") pod \"iptables-alerter-nztqh\" (UID: \"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5\") " pod="openshift-network-operator/iptables-alerter-nztqh"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988822 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-device-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6"
Mar 18 16:43:51.990798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988870 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e2ef5c8-0952-4eef-a0d9-19656f20a5a5-host-slash\") pod \"iptables-alerter-nztqh\" (UID: \"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5\") " pod="openshift-network-operator/iptables-alerter-nztqh"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988894 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-cnibin\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988918 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-node-log\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988918 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-device-dir\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988918 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e2ef5c8-0952-4eef-a0d9-19656f20a5a5-host-slash\") pod \"iptables-alerter-nztqh\" (UID: \"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5\") " pod="openshift-network-operator/iptables-alerter-nztqh"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988960 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-cnibin\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988957 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2da91ba-0645-46a4-a59d-b7219ba40de9-node-log\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.988983 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9s8k\" (UniqueName: \"kubernetes.io/projected/c2da91ba-0645-46a4-a59d-b7219ba40de9-kube-api-access-q9s8k\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989011 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4aac1f96-b5de-49d5-84a3-176806d103dc-serviceca\") pod \"node-ca-f9chd\" (UID: \"4aac1f96-b5de-49d5-84a3-176806d103dc\") " pod="openshift-image-registry/node-ca-f9chd"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989023 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-multus-daemon-config\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989036 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7ef73172-0617-4d44-b9f4-f5d3832924d2-konnectivity-ca\") pod \"konnectivity-agent-dsml5\" (UID: \"7ef73172-0617-4d44-b9f4-f5d3832924d2\") " pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989074 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-etc-selinux\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989136 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989178 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-os-release\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989303 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-os-release\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989312 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-etc-selinux\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989419 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4aac1f96-b5de-49d5-84a3-176806d103dc-serviceca\") pod \"node-ca-f9chd\" (UID: \"4aac1f96-b5de-49d5-84a3-176806d103dc\") " pod="openshift-image-registry/node-ca-f9chd"
Mar 18 16:43:51.991664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989516 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7ef73172-0617-4d44-b9f4-f5d3832924d2-konnectivity-ca\") pod \"konnectivity-agent-dsml5\" (UID: \"7ef73172-0617-4d44-b9f4-f5d3832924d2\") " pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:43:51.992374 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.989868 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:51.992374 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.990374 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/41131912-91a1-43ad-a23a-203bd6091794-etc-tuned\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.992374 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.991402 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41131912-91a1-43ad-a23a-203bd6091794-tmp\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:51.992374 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.991444 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2da91ba-0645-46a4-a59d-b7219ba40de9-ovn-node-metrics-cert\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:51.992374 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.992336 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7ef73172-0617-4d44-b9f4-f5d3832924d2-agent-certs\") pod \"konnectivity-agent-dsml5\" (UID: \"7ef73172-0617-4d44-b9f4-f5d3832924d2\") " pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:43:51.994261 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:51.994240 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf9t6\" (UniqueName: \"kubernetes.io/projected/c0b2d8c5-09a4-472a-aad2-fa033be042f3-kube-api-access-pf9t6\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:43:52.001355 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.001337 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdc5r\" (UniqueName: \"kubernetes.io/projected/2e2ef5c8-0952-4eef-a0d9-19656f20a5a5-kube-api-access-kdc5r\") pod \"iptables-alerter-nztqh\" (UID: \"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5\") " pod="openshift-network-operator/iptables-alerter-nztqh"
Mar 18 16:43:52.001669 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.001633 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x88f\" (UniqueName: \"kubernetes.io/projected/ef8cdb92-dfe0-4b41-8256-a466ea85d67a-kube-api-access-9x88f\") pod \"multus-additional-cni-plugins-jxfqg\" (UID: \"ef8cdb92-dfe0-4b41-8256-a466ea85d67a\") " pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:52.002410 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.002173 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:52.002410 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.002193 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:52.002410 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.002213 2563 projected.go:194] Error preparing data for projected volume kube-api-access-ftg87 for pod openshift-network-diagnostics/network-check-target-s8rrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:52.002410 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.002266 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87 podName:fde7e0f5-379e-4950-a766-5b94afe18049 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:52.502250607 +0000 UTC m=+3.094191141 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ftg87" (UniqueName: "kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87") pod "network-check-target-s8rrk" (UID: "fde7e0f5-379e-4950-a766-5b94afe18049") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:52.002789 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.002770 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxcm6\" (UniqueName: \"kubernetes.io/projected/777f8e63-1d9b-4424-b5e7-62a6ccb4658f-kube-api-access-vxcm6\") pod \"multus-glstl\" (UID: \"777f8e63-1d9b-4424-b5e7-62a6ccb4658f\") " pod="openshift-multus/multus-glstl"
Mar 18 16:43:52.003953 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.003936 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhq76\" (UniqueName: \"kubernetes.io/projected/41131912-91a1-43ad-a23a-203bd6091794-kube-api-access-dhq76\") pod \"tuned-lxcdc\" (UID: \"41131912-91a1-43ad-a23a-203bd6091794\") " pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:52.004045 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.004031 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zxnd\" (UniqueName: \"kubernetes.io/projected/4aac1f96-b5de-49d5-84a3-176806d103dc-kube-api-access-6zxnd\") pod \"node-ca-f9chd\" (UID: \"4aac1f96-b5de-49d5-84a3-176806d103dc\") " pod="openshift-image-registry/node-ca-f9chd"
Mar 18 16:43:52.004789 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.004769 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9s8k\" (UniqueName: \"kubernetes.io/projected/c2da91ba-0645-46a4-a59d-b7219ba40de9-kube-api-access-q9s8k\") pod \"ovnkube-node-9gjw2\" (UID: \"c2da91ba-0645-46a4-a59d-b7219ba40de9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:52.005335 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.005316 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh5kx\" (UniqueName: \"kubernetes.io/projected/e02e2a33-e5cb-4c07-8e13-ed2992b81a67-kube-api-access-bh5kx\") pod \"aws-ebs-csi-driver-node-mvvf6\" (UID: \"e02e2a33-e5cb-4c07-8e13-ed2992b81a67\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6"
Mar 18 16:43:52.168882 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.168798 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:43:52.175777 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.175752 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6"
Mar 18 16:43:52.184579 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.184559 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-f9chd"
Mar 18 16:43:52.192199 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.192182 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nztqh"
Mar 18 16:43:52.199299 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.199282 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lxcdc"
Mar 18 16:43:52.205836 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.205819 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jxfqg"
Mar 18 16:43:52.213555 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.213538 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-glstl"
Mar 18 16:43:52.214050 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.213920 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 18 16:43:52.219097 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.219081 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:43:52.491822 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.491749 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:43:52.491961 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.491905 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:52.492010 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.491977 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs podName:c0b2d8c5-09a4-472a-aad2-fa033be042f3 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.49195691 +0000 UTC m=+4.083897449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs") pod "network-metrics-daemon-kbxwz" (UID: "c0b2d8c5-09a4-472a-aad2-fa033be042f3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:52.593010 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.592976 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftg87\" (UniqueName: \"kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87\") pod \"network-check-target-s8rrk\" (UID: \"fde7e0f5-379e-4950-a766-5b94afe18049\") " pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:43:52.593196 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.593129 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:52.593196 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.593151 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:52.593196 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.593162 2563 projected.go:194] Error preparing data for projected volume kube-api-access-ftg87 for pod openshift-network-diagnostics/network-check-target-s8rrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:52.593295 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:52.593218 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87 podName:fde7e0f5-379e-4950-a766-5b94afe18049 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:53.593202295 +0000 UTC m=+4.185142829 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ftg87" (UniqueName: "kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87") pod "network-check-target-s8rrk" (UID: "fde7e0f5-379e-4950-a766-5b94afe18049") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:52.638125 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:52.638099 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode02e2a33_e5cb_4c07_8e13_ed2992b81a67.slice/crio-f6674af436b0ff173234841889761bbebc111c58069b568fb7e35e5ede990895 WatchSource:0}: Error finding container f6674af436b0ff173234841889761bbebc111c58069b568fb7e35e5ede990895: Status 404 returned error can't find the container with id f6674af436b0ff173234841889761bbebc111c58069b568fb7e35e5ede990895
Mar 18 16:43:52.639544 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:52.639418 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aac1f96_b5de_49d5_84a3_176806d103dc.slice/crio-7041db8a9525d40775191a8ea96576ca960cebe0d19c8a1663c3056f2d4135dd WatchSource:0}: Error finding container 7041db8a9525d40775191a8ea96576ca960cebe0d19c8a1663c3056f2d4135dd: Status 404 returned error can't find the container with id 7041db8a9525d40775191a8ea96576ca960cebe0d19c8a1663c3056f2d4135dd
Mar 18 16:43:52.642852 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:52.642829 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41131912_91a1_43ad_a23a_203bd6091794.slice/crio-4d9fc8717cc4a10279a79f5b7cf7df9134e754dfd7c9fa57c7d01a7d6dd5cf19 WatchSource:0}: Error finding container 4d9fc8717cc4a10279a79f5b7cf7df9134e754dfd7c9fa57c7d01a7d6dd5cf19: Status 404 returned error can't find the container with id 4d9fc8717cc4a10279a79f5b7cf7df9134e754dfd7c9fa57c7d01a7d6dd5cf19
Mar 18 16:43:52.643761 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:52.643744 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e2ef5c8_0952_4eef_a0d9_19656f20a5a5.slice/crio-7c61049ec0ba4d1c9d8c07ddf599de2d633475983d2edb37bf3e692167887c88 WatchSource:0}: Error finding container 7c61049ec0ba4d1c9d8c07ddf599de2d633475983d2edb37bf3e692167887c88: Status 404 returned error can't find the container with id 7c61049ec0ba4d1c9d8c07ddf599de2d633475983d2edb37bf3e692167887c88
Mar 18 16:43:52.651065 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:43:52.651039 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef8cdb92_dfe0_4b41_8256_a466ea85d67a.slice/crio-f963985118c51c0156d92362e37fd63559be70635d94f2f7fdcd5a50d51287d1 WatchSource:0}: Error finding container f963985118c51c0156d92362e37fd63559be70635d94f2f7fdcd5a50d51287d1: Status 404 returned error can't find the container with id f963985118c51c0156d92362e37fd63559be70635d94f2f7fdcd5a50d51287d1
Mar 18 16:43:52.926433 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.926346 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-17 16:38:50 +0000 UTC" deadline="2027-10-26 22:33:17.048833859 +0000 UTC"
Mar 18 16:43:52.926433 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.926374 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14093h49m24.122462515s"
Mar 18 16:43:52.946451 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.946421 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" event={"ID":"ef8cdb92-dfe0-4b41-8256-a466ea85d67a","Type":"ContainerStarted","Data":"f963985118c51c0156d92362e37fd63559be70635d94f2f7fdcd5a50d51287d1"}
Mar 18 16:43:52.947758 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.947734 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-glstl" event={"ID":"777f8e63-1d9b-4424-b5e7-62a6ccb4658f","Type":"ContainerStarted","Data":"44b5618215bc14ed0ffff1ab53fa9728c2273a699e20b8813e26e1fe019f4441"}
Mar 18 16:43:52.950122 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.950095 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nztqh" event={"ID":"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5","Type":"ContainerStarted","Data":"7c61049ec0ba4d1c9d8c07ddf599de2d633475983d2edb37bf3e692167887c88"}
Mar 18 16:43:52.950979 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.950957 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f9chd" event={"ID":"4aac1f96-b5de-49d5-84a3-176806d103dc","Type":"ContainerStarted","Data":"7041db8a9525d40775191a8ea96576ca960cebe0d19c8a1663c3056f2d4135dd"}
Mar 18 16:43:52.953338 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.953315 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal" event={"ID":"4c21613cafe918f7212fff6fba314410","Type":"ContainerStarted","Data":"c4387e4091f434b3598c840ce6f0461f09720bc4a0fda6945b556a9f1ff0ce0b"}
Mar 18 16:43:52.954219 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.954200 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dsml5" event={"ID":"7ef73172-0617-4d44-b9f4-f5d3832924d2","Type":"ContainerStarted","Data":"f19be2096243ca262d012093372551ff79a524d33c9bb72b6cdd7478be4ab8ea"}
Mar 18 16:43:52.955218 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.955198 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" event={"ID":"c2da91ba-0645-46a4-a59d-b7219ba40de9","Type":"ContainerStarted","Data":"5677d75c73d49e4be84c0239782a2964d7cf0ae0393203cab3fb5d69bf4cbbbb"}
Mar 18 16:43:52.956148 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.956128 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" event={"ID":"41131912-91a1-43ad-a23a-203bd6091794","Type":"ContainerStarted","Data":"4d9fc8717cc4a10279a79f5b7cf7df9134e754dfd7c9fa57c7d01a7d6dd5cf19"}
Mar 18 16:43:52.957314 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.957292 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" event={"ID":"e02e2a33-e5cb-4c07-8e13-ed2992b81a67","Type":"ContainerStarted","Data":"f6674af436b0ff173234841889761bbebc111c58069b568fb7e35e5ede990895"}
Mar 18 16:43:52.966401 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:52.966365 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-231.ec2.internal" podStartSLOduration=1.9663550669999998 podStartE2EDuration="1.966355067s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:43:52.966273762 +0000 UTC m=+3.558214325" watchObservedRunningTime="2026-03-18 16:43:52.966355067 +0000 UTC m=+3.558295617"
Mar 18 16:43:53.502543 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:53.502510 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:43:53.502699 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:53.502683 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:53.502762 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:53.502745 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs podName:c0b2d8c5-09a4-472a-aad2-fa033be042f3 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:55.502725245 +0000 UTC m=+6.094665775 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs") pod "network-metrics-daemon-kbxwz" (UID: "c0b2d8c5-09a4-472a-aad2-fa033be042f3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:53.605516 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:53.603443 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftg87\" (UniqueName: \"kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87\") pod \"network-check-target-s8rrk\" (UID: \"fde7e0f5-379e-4950-a766-5b94afe18049\") " pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:43:53.605516 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:53.603625 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:53.605516 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:53.603643 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:53.605516 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:53.603655 2563 projected.go:194] Error preparing data for projected volume kube-api-access-ftg87 for pod openshift-network-diagnostics/network-check-target-s8rrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:53.605516 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:53.603708 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87 podName:fde7e0f5-379e-4950-a766-5b94afe18049 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:55.603689312 +0000 UTC m=+6.195629856 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ftg87" (UniqueName: "kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87") pod "network-check-target-s8rrk" (UID: "fde7e0f5-379e-4950-a766-5b94afe18049") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:43:53.939418 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:53.938778 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:43:53.939418 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:53.938900 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3"
Mar 18 16:43:53.939418 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:53.939285 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:43:53.939418 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:53.939370 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049"
Mar 18 16:43:53.973544 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:53.972816 2563 generic.go:358] "Generic (PLEG): container finished" podID="745299a5e6bc40770486c0ea9932489e" containerID="fd5d0ce141a3946e282640a88f2e3865283ea6d5cafb98f8f144d1c69f317ad2" exitCode=0
Mar 18 16:43:53.973683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:53.973550 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" event={"ID":"745299a5e6bc40770486c0ea9932489e","Type":"ContainerDied","Data":"fd5d0ce141a3946e282640a88f2e3865283ea6d5cafb98f8f144d1c69f317ad2"}
Mar 18 16:43:54.978587 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:54.978495 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" event={"ID":"745299a5e6bc40770486c0ea9932489e","Type":"ContainerStarted","Data":"45ab3811b5e1bc53f2744d34dad4f3cf179d07a41af00ef59fe01a7d6239ed0b"}
Mar 18 16:43:55.517024 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:55.516988 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:43:55.517209 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:55.517153 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:55.517269 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:55.517226 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs podName:c0b2d8c5-09a4-472a-aad2-fa033be042f3 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:59.51720767 +0000 UTC m=+10.109148199 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs") pod "network-metrics-daemon-kbxwz" (UID: "c0b2d8c5-09a4-472a-aad2-fa033be042f3") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 16:43:55.617694 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:55.617617 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftg87\" (UniqueName: \"kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87\") pod \"network-check-target-s8rrk\" (UID: \"fde7e0f5-379e-4950-a766-5b94afe18049\") " pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:43:55.617856 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:55.617774 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 16:43:55.617856 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:55.617798 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 16:43:55.617856 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:55.617811
2563 projected.go:194] Error preparing data for projected volume kube-api-access-ftg87 for pod openshift-network-diagnostics/network-check-target-s8rrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:55.618035 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:55.617872 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87 podName:fde7e0f5-379e-4950-a766-5b94afe18049 nodeName:}" failed. No retries permitted until 2026-03-18 16:43:59.61785332 +0000 UTC m=+10.209793856 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ftg87" (UniqueName: "kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87") pod "network-check-target-s8rrk" (UID: "fde7e0f5-379e-4950-a766-5b94afe18049") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:55.937118 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:55.936993 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:43:55.937118 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:55.937020 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:43:55.937373 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:55.937134 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3" Mar 18 16:43:55.937592 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:55.937545 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049" Mar 18 16:43:57.936618 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:57.936585 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:43:57.936618 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:57.936622 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:43:57.937136 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:57.936735 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3" Mar 18 16:43:57.937136 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:57.936872 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049" Mar 18 16:43:59.401154 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.401081 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-231.ec2.internal" podStartSLOduration=8.401063306 podStartE2EDuration="8.401063306s" podCreationTimestamp="2026-03-18 16:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:43:54.990920061 +0000 UTC m=+5.582860613" watchObservedRunningTime="2026-03-18 16:43:59.401063306 +0000 UTC m=+9.993003861" Mar 18 16:43:59.401841 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.401805 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-mlpnc"] Mar 18 16:43:59.406232 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.406211 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.408659 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.408638 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 18 16:43:59.408659 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.408651 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-msxvk\"" Mar 18 16:43:59.409187 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.409168 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 18 16:43:59.450251 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.450139 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx7k6\" (UniqueName: \"kubernetes.io/projected/d2859c82-5f61-4503-bb69-2147a77ca895-kube-api-access-lx7k6\") pod \"node-resolver-mlpnc\" (UID: \"d2859c82-5f61-4503-bb69-2147a77ca895\") " pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.450251 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.450185 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2859c82-5f61-4503-bb69-2147a77ca895-tmp-dir\") pod \"node-resolver-mlpnc\" (UID: \"d2859c82-5f61-4503-bb69-2147a77ca895\") " pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.450434 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.450303 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2859c82-5f61-4503-bb69-2147a77ca895-hosts-file\") pod \"node-resolver-mlpnc\" (UID: \"d2859c82-5f61-4503-bb69-2147a77ca895\") " pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.551519 ip-10-0-141-231 kubenswrapper[2563]: I0318 
16:43:59.551471 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:43:59.551686 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.551553 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2859c82-5f61-4503-bb69-2147a77ca895-hosts-file\") pod \"node-resolver-mlpnc\" (UID: \"d2859c82-5f61-4503-bb69-2147a77ca895\") " pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.551686 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.551597 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lx7k6\" (UniqueName: \"kubernetes.io/projected/d2859c82-5f61-4503-bb69-2147a77ca895-kube-api-access-lx7k6\") pod \"node-resolver-mlpnc\" (UID: \"d2859c82-5f61-4503-bb69-2147a77ca895\") " pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.551686 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:59.551611 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:59.551686 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.551622 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2859c82-5f61-4503-bb69-2147a77ca895-tmp-dir\") pod \"node-resolver-mlpnc\" (UID: \"d2859c82-5f61-4503-bb69-2147a77ca895\") " pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.551686 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:59.551668 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs 
podName:c0b2d8c5-09a4-472a-aad2-fa033be042f3 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:07.551651401 +0000 UTC m=+18.143591940 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs") pod "network-metrics-daemon-kbxwz" (UID: "c0b2d8c5-09a4-472a-aad2-fa033be042f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:43:59.551964 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.551934 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d2859c82-5f61-4503-bb69-2147a77ca895-hosts-file\") pod \"node-resolver-mlpnc\" (UID: \"d2859c82-5f61-4503-bb69-2147a77ca895\") " pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.552600 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.552578 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d2859c82-5f61-4503-bb69-2147a77ca895-tmp-dir\") pod \"node-resolver-mlpnc\" (UID: \"d2859c82-5f61-4503-bb69-2147a77ca895\") " pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.565712 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.565687 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx7k6\" (UniqueName: \"kubernetes.io/projected/d2859c82-5f61-4503-bb69-2147a77ca895-kube-api-access-lx7k6\") pod \"node-resolver-mlpnc\" (UID: \"d2859c82-5f61-4503-bb69-2147a77ca895\") " pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.652355 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.652283 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftg87\" (UniqueName: \"kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87\") pod \"network-check-target-s8rrk\" (UID: \"fde7e0f5-379e-4950-a766-5b94afe18049\") " 
pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:43:59.652492 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:59.652451 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:43:59.652492 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:59.652471 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:43:59.652492 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:59.652484 2563 projected.go:194] Error preparing data for projected volume kube-api-access-ftg87 for pod openshift-network-diagnostics/network-check-target-s8rrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:59.652673 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:59.652560 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87 podName:fde7e0f5-379e-4950-a766-5b94afe18049 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:07.652540852 +0000 UTC m=+18.244481395 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ftg87" (UniqueName: "kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87") pod "network-check-target-s8rrk" (UID: "fde7e0f5-379e-4950-a766-5b94afe18049") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:43:59.719765 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.719739 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-mlpnc" Mar 18 16:43:59.939749 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.937577 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:43:59.939749 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:59.937690 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3" Mar 18 16:43:59.939749 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:43:59.938108 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:43:59.939749 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:43:59.938196 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049" Mar 18 16:44:01.937186 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:01.937162 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:44:01.937653 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:01.937276 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3" Mar 18 16:44:01.937653 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:01.937347 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:44:01.937653 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:01.937427 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049" Mar 18 16:44:03.936849 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:03.936813 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:44:03.936849 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:03.936855 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:44:03.937297 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:03.936929 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049" Mar 18 16:44:03.937297 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:03.937055 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3" Mar 18 16:44:05.939485 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:05.939459 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:44:05.939485 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:05.939474 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:44:05.939897 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:05.939577 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049" Mar 18 16:44:05.939897 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:05.939651 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3" Mar 18 16:44:06.824659 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:06.824625 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-svgkn"] Mar 18 16:44:06.852016 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:06.851993 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:06.852164 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:06.852063 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-svgkn" podUID="93ac321e-d058-4c0a-9994-91d01edf06db" Mar 18 16:44:06.909960 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:06.909931 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:06.910091 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:06.910019 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/93ac321e-d058-4c0a-9994-91d01edf06db-dbus\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:06.910091 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:06.910051 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/93ac321e-d058-4c0a-9994-91d01edf06db-kubelet-config\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:07.010663 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.010632 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/93ac321e-d058-4c0a-9994-91d01edf06db-kubelet-config\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:07.011062 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.010681 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:07.011062 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.010752 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/93ac321e-d058-4c0a-9994-91d01edf06db-dbus\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:07.011062 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.010752 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/93ac321e-d058-4c0a-9994-91d01edf06db-kubelet-config\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:07.011062 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.010875 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/93ac321e-d058-4c0a-9994-91d01edf06db-dbus\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:07.011062 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.010880 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:07.011062 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.010955 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret podName:93ac321e-d058-4c0a-9994-91d01edf06db nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:07.510935147 +0000 UTC m=+18.102875689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret") pod "global-pull-secret-syncer-svgkn" (UID: "93ac321e-d058-4c0a-9994-91d01edf06db") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:07.514084 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.514047 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:07.514233 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.514172 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:07.514298 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.514233 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret podName:93ac321e-d058-4c0a-9994-91d01edf06db nodeName:}" failed. No retries permitted until 2026-03-18 16:44:08.514215634 +0000 UTC m=+19.106156171 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret") pod "global-pull-secret-syncer-svgkn" (UID: "93ac321e-d058-4c0a-9994-91d01edf06db") : object "kube-system"/"original-pull-secret" not registered Mar 18 16:44:07.614512 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.614476 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:44:07.614670 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.614624 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:07.614726 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.614680 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs podName:c0b2d8c5-09a4-472a-aad2-fa033be042f3 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.614664167 +0000 UTC m=+34.206604716 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs") pod "network-metrics-daemon-kbxwz" (UID: "c0b2d8c5-09a4-472a-aad2-fa033be042f3") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 16:44:07.715297 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.715250 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftg87\" (UniqueName: \"kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87\") pod \"network-check-target-s8rrk\" (UID: \"fde7e0f5-379e-4950-a766-5b94afe18049\") " pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:44:07.715468 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.715402 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 16:44:07.715468 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.715418 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 16:44:07.715468 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.715430 2563 projected.go:194] Error preparing data for projected volume kube-api-access-ftg87 for pod openshift-network-diagnostics/network-check-target-s8rrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 16:44:07.715607 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.715489 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87 podName:fde7e0f5-379e-4950-a766-5b94afe18049 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:23.715471631 +0000 UTC m=+34.307412177 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ftg87" (UniqueName: "kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87") pod "network-check-target-s8rrk" (UID: "fde7e0f5-379e-4950-a766-5b94afe18049") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 16:44:07.936493 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.936420 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:07.936712 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:07.936420 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:44:07.936712 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.936528 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049"
Mar 18 16:44:07.936712 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:07.936587 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3"
Mar 18 16:44:08.521536 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:08.521480 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:08.521963 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:08.521625 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:08.521963 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:08.521699 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret podName:93ac321e-d058-4c0a-9994-91d01edf06db nodeName:}" failed. No retries permitted until 2026-03-18 16:44:10.521679145 +0000 UTC m=+21.113619675 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret") pod "global-pull-secret-syncer-svgkn" (UID: "93ac321e-d058-4c0a-9994-91d01edf06db") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:08.936878 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:08.936813 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:08.937008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:08.936927 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgkn" podUID="93ac321e-d058-4c0a-9994-91d01edf06db"
Mar 18 16:44:09.456349 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:44:09.456325 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2859c82_5f61_4503_bb69_2147a77ca895.slice/crio-b5afce17dea224e1e052bba7c16e3cfe4377a500d59180b155624dcbbd8943c2 WatchSource:0}: Error finding container b5afce17dea224e1e052bba7c16e3cfe4377a500d59180b155624dcbbd8943c2: Status 404 returned error can't find the container with id b5afce17dea224e1e052bba7c16e3cfe4377a500d59180b155624dcbbd8943c2
Mar 18 16:44:09.937030 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:09.936874 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:09.937754 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:09.937082 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049"
Mar 18 16:44:09.937754 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:09.936968 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:44:09.937754 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:09.937224 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3"
Mar 18 16:44:10.003622 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.003452 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dsml5" event={"ID":"7ef73172-0617-4d44-b9f4-f5d3832924d2","Type":"ContainerStarted","Data":"afa90c5d822d0afe78b09382a019432e42e482b241c382609ba6767a74d86cb2"}
Mar 18 16:44:10.005141 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.005111 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" event={"ID":"c2da91ba-0645-46a4-a59d-b7219ba40de9","Type":"ContainerStarted","Data":"fef3dfd89a5913700e8d8262d16a72c7a35359bac42129da3bbf6a1c7f58af2c"}
Mar 18 16:44:10.005253 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.005151 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" event={"ID":"c2da91ba-0645-46a4-a59d-b7219ba40de9","Type":"ContainerStarted","Data":"46f80bbb6ce1987c0864c8feeae684e7cd873d127948f122c3b0570dbe968388"}
Mar 18 16:44:10.006512 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.006470 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" event={"ID":"41131912-91a1-43ad-a23a-203bd6091794","Type":"ContainerStarted","Data":"3bfad37b73a1dfb86ba37b7def08b56f1a807eb62738ec358604c9179cbf0a9a"}
Mar 18 16:44:10.008121 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.008095 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" event={"ID":"e02e2a33-e5cb-4c07-8e13-ed2992b81a67","Type":"ContainerStarted","Data":"8f2b6e8ed1e876bf400a82f673d69bbca54adb656a27fb9047a0e5086db8dfc9"}
Mar 18 16:44:10.009426 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.009405 2563 generic.go:358] "Generic (PLEG): container finished" podID="ef8cdb92-dfe0-4b41-8256-a466ea85d67a" containerID="4923d8e5475f7e2f7848a5a8934c730e5b8f78e24a757007314595a7fcc3745a" exitCode=0
Mar 18 16:44:10.009538 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.009465 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" event={"ID":"ef8cdb92-dfe0-4b41-8256-a466ea85d67a","Type":"ContainerDied","Data":"4923d8e5475f7e2f7848a5a8934c730e5b8f78e24a757007314595a7fcc3745a"}
Mar 18 16:44:10.010846 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.010810 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-glstl" event={"ID":"777f8e63-1d9b-4424-b5e7-62a6ccb4658f","Type":"ContainerStarted","Data":"6a1b190e5d486c98ddbb4a22ad83cca6f233ea601ba1ed68fa3aad96ba40e8b4"}
Mar 18 16:44:10.012130 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.012099 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f9chd" event={"ID":"4aac1f96-b5de-49d5-84a3-176806d103dc","Type":"ContainerStarted","Data":"362b1be8c9be202a940a383981eb8a445b4911574c6b777c4a64b9576981d51b"}
Mar 18 16:44:10.013265 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.013241 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mlpnc" event={"ID":"d2859c82-5f61-4503-bb69-2147a77ca895","Type":"ContainerStarted","Data":"061d060351f6529150fcc072a50810f94fc1db070ab8e525a3041db5f212fa7a"}
Mar 18 16:44:10.013327 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.013270 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mlpnc" event={"ID":"d2859c82-5f61-4503-bb69-2147a77ca895","Type":"ContainerStarted","Data":"b5afce17dea224e1e052bba7c16e3cfe4377a500d59180b155624dcbbd8943c2"}
Mar 18 16:44:10.015933 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.015896 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dsml5" podStartSLOduration=3.223051694 podStartE2EDuration="20.015881907s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.651134515 +0000 UTC m=+3.243075044" lastFinishedPulling="2026-03-18 16:44:09.443964713 +0000 UTC m=+20.035905257" observedRunningTime="2026-03-18 16:44:10.015533859 +0000 UTC m=+20.607474401" watchObservedRunningTime="2026-03-18 16:44:10.015881907 +0000 UTC m=+20.607822457"
Mar 18 16:44:10.025724 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.025680 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-f9chd" podStartSLOduration=3.221841267 podStartE2EDuration="20.025665397s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.641613912 +0000 UTC m=+3.233554441" lastFinishedPulling="2026-03-18 16:44:09.445438028 +0000 UTC m=+20.037378571" observedRunningTime="2026-03-18 16:44:10.025006042 +0000 UTC m=+20.616946609" watchObservedRunningTime="2026-03-18 16:44:10.025665397 +0000 UTC m=+20.617605950"
Mar 18 16:44:10.050320 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.050274 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-glstl" podStartSLOduration=3.22532678 podStartE2EDuration="20.050262316s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.650104529 +0000 UTC m=+3.242045064" lastFinishedPulling="2026-03-18 16:44:09.475040071 +0000 UTC m=+20.066980600" observedRunningTime="2026-03-18 16:44:10.049898508 +0000 UTC m=+20.641839061" watchObservedRunningTime="2026-03-18 16:44:10.050262316 +0000 UTC m=+20.642202866"
Mar 18 16:44:10.066715 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.066672 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mlpnc" podStartSLOduration=11.066656885 podStartE2EDuration="11.066656885s" podCreationTimestamp="2026-03-18 16:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:44:10.066411646 +0000 UTC m=+20.658352200" watchObservedRunningTime="2026-03-18 16:44:10.066656885 +0000 UTC m=+20.658597437"
Mar 18 16:44:10.537661 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.537632 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:10.537774 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:10.537738 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:10.537839 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:10.537787 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret podName:93ac321e-d058-4c0a-9994-91d01edf06db nodeName:}" failed. No retries permitted until 2026-03-18 16:44:14.537775658 +0000 UTC m=+25.129716192 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret") pod "global-pull-secret-syncer-svgkn" (UID: "93ac321e-d058-4c0a-9994-91d01edf06db") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:10.563993 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.563967 2563 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Mar 18 16:44:10.936446 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.936372 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:10.936605 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:10.936522 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgkn" podUID="93ac321e-d058-4c0a-9994-91d01edf06db"
Mar 18 16:44:10.941590 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.941465 2563 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-18T16:44:10.563990468Z","UUID":"5cd9b496-e977-4b7b-9812-2d5341d0fb0b","Handler":null,"Name":"","Endpoint":""}
Mar 18 16:44:10.943927 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.943901 2563 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Mar 18 16:44:10.943927 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:10.943930 2563 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Mar 18 16:44:11.017419 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:11.017390 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nztqh" event={"ID":"2e2ef5c8-0952-4eef-a0d9-19656f20a5a5","Type":"ContainerStarted","Data":"56906914ad6dbf12dc9b6f3ca72d60642a61ff4e3e72c8f96a7bd25ca66e3731"}
Mar 18 16:44:11.020632 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:11.020605 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" event={"ID":"c2da91ba-0645-46a4-a59d-b7219ba40de9","Type":"ContainerStarted","Data":"a7b9bd55bd8a9f9f9633c1c68d7b02e322822713277c669ab389186d76a77730"}
Mar 18 16:44:11.020728 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:11.020638 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" event={"ID":"c2da91ba-0645-46a4-a59d-b7219ba40de9","Type":"ContainerStarted","Data":"abeff6284fe46fc739f8acd5dabc9109354b04e96db9cc4259f935dad5c5a15a"}
Mar 18 16:44:11.020728 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:11.020653 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" event={"ID":"c2da91ba-0645-46a4-a59d-b7219ba40de9","Type":"ContainerStarted","Data":"6d8cba1ff7c099c96abd4aea273be66c114291e0acec19dc935c838b86c35061"}
Mar 18 16:44:11.020728 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:11.020665 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" event={"ID":"c2da91ba-0645-46a4-a59d-b7219ba40de9","Type":"ContainerStarted","Data":"03ad5a59167560e7599f0cce804a8e2030ce53fda96cd29671d750affffe7000"}
Mar 18 16:44:11.023114 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:11.023081 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" event={"ID":"e02e2a33-e5cb-4c07-8e13-ed2992b81a67","Type":"ContainerStarted","Data":"b7f5171e18d5d4945120653c8fda215da1e5d4c0616009996b487bf8dda1ea93"}
Mar 18 16:44:11.029532 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:11.029453 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lxcdc" podStartSLOduration=4.233535233 podStartE2EDuration="21.029436914s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.647878092 +0000 UTC m=+3.239818625" lastFinishedPulling="2026-03-18 16:44:09.443779777 +0000 UTC m=+20.035720306" observedRunningTime="2026-03-18 16:44:10.07894231 +0000 UTC m=+20.670882863" watchObservedRunningTime="2026-03-18 16:44:11.029436914 +0000 UTC m=+21.621377465"
Mar 18 16:44:11.936839 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:11.936642 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:11.937099 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:11.936697 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:44:11.937099 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:11.936923 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049"
Mar 18 16:44:11.937099 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:11.936967 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3"
Mar 18 16:44:12.027020 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:12.026984 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" event={"ID":"e02e2a33-e5cb-4c07-8e13-ed2992b81a67","Type":"ContainerStarted","Data":"bc0eb0a98af1d3d125eb2c2cf5a8f4773e124173f6a04552da295f6c2f6cb82b"}
Mar 18 16:44:12.062185 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:12.062141 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mvvf6" podStartSLOduration=3.392067776 podStartE2EDuration="22.062129101s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.640053487 +0000 UTC m=+3.231994036" lastFinishedPulling="2026-03-18 16:44:11.310114824 +0000 UTC m=+21.902055361" observedRunningTime="2026-03-18 16:44:12.061859576 +0000 UTC m=+22.653800126" watchObservedRunningTime="2026-03-18 16:44:12.062129101 +0000 UTC m=+22.654069673"
Mar 18 16:44:12.062410 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:12.062382 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-nztqh" podStartSLOduration=5.268790835 podStartE2EDuration="22.062375725s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.650212467 +0000 UTC m=+3.242152995" lastFinishedPulling="2026-03-18 16:44:09.443797355 +0000 UTC m=+20.035737885" observedRunningTime="2026-03-18 16:44:11.029575012 +0000 UTC m=+21.621515547" watchObservedRunningTime="2026-03-18 16:44:12.062375725 +0000 UTC m=+22.654316352"
Mar 18 16:44:12.504434 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:12.504407 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:44:12.504948 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:12.504931 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:44:12.936539 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:12.936446 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:12.936725 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:12.936583 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgkn" podUID="93ac321e-d058-4c0a-9994-91d01edf06db"
Mar 18 16:44:13.029291 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:13.029263 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:44:13.029823 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:13.029572 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dsml5"
Mar 18 16:44:13.936422 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:13.936389 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:44:13.936617 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:13.936546 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3"
Mar 18 16:44:13.936617 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:13.936590 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:13.936736 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:13.936674 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049"
Mar 18 16:44:14.571723 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:14.571646 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:14.572284 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:14.571802 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:14.572284 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:14.571870 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret podName:93ac321e-d058-4c0a-9994-91d01edf06db nodeName:}" failed. No retries permitted until 2026-03-18 16:44:22.571855226 +0000 UTC m=+33.163795760 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret") pod "global-pull-secret-syncer-svgkn" (UID: "93ac321e-d058-4c0a-9994-91d01edf06db") : object "kube-system"/"original-pull-secret" not registered
Mar 18 16:44:14.936633 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:14.936611 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:14.936781 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:14.936693 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgkn" podUID="93ac321e-d058-4c0a-9994-91d01edf06db"
Mar 18 16:44:15.035219 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:15.035189 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" event={"ID":"c2da91ba-0645-46a4-a59d-b7219ba40de9","Type":"ContainerStarted","Data":"de593ec15a17ac65b639d519a28a48d8af3740c9e284d3fcf33ececbb02ccfdc"}
Mar 18 16:44:15.036790 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:15.036767 2563 generic.go:358] "Generic (PLEG): container finished" podID="ef8cdb92-dfe0-4b41-8256-a466ea85d67a" containerID="2c397427814b30d33c9b012efa4e8fd6f9d87cc821df4bf522c973d353adf764" exitCode=0
Mar 18 16:44:15.036891 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:15.036855 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" event={"ID":"ef8cdb92-dfe0-4b41-8256-a466ea85d67a","Type":"ContainerDied","Data":"2c397427814b30d33c9b012efa4e8fd6f9d87cc821df4bf522c973d353adf764"}
Mar 18 16:44:15.936522 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:15.936485 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:44:15.936522 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:15.936515 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:15.936801 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:15.936597 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3"
Mar 18 16:44:15.936801 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:15.936731 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049"
Mar 18 16:44:16.039702 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:16.039645 2563 generic.go:358] "Generic (PLEG): container finished" podID="ef8cdb92-dfe0-4b41-8256-a466ea85d67a" containerID="0ecdc7715bb8c2bc81eccecebfeb1072475903730f003c36dd9369d11c1ba5e1" exitCode=0
Mar 18 16:44:16.039702 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:16.039679 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" event={"ID":"ef8cdb92-dfe0-4b41-8256-a466ea85d67a","Type":"ContainerDied","Data":"0ecdc7715bb8c2bc81eccecebfeb1072475903730f003c36dd9369d11c1ba5e1"}
Mar 18 16:44:16.937170 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:16.936997 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:16.937551 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:16.937238 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgkn" podUID="93ac321e-d058-4c0a-9994-91d01edf06db"
Mar 18 16:44:17.044005 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.043973 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" event={"ID":"c2da91ba-0645-46a4-a59d-b7219ba40de9","Type":"ContainerStarted","Data":"8e530e9273e8f6d906c6cea05e7a425ac810bf244a5040b7f468a2fdc1f2d0b8"}
Mar 18 16:44:17.045096 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.044901 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:44:17.045096 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.045039 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:44:17.045096 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.045067 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:44:17.049601 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.049577 2563 generic.go:358] "Generic (PLEG): container finished" podID="ef8cdb92-dfe0-4b41-8256-a466ea85d67a" containerID="1229bdb1fcf81aa6a8b9829c405a149253ff5765c87f2efb79f15b1aec15abec" exitCode=0
Mar 18 16:44:17.049696 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.049616 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" event={"ID":"ef8cdb92-dfe0-4b41-8256-a466ea85d67a","Type":"ContainerDied","Data":"1229bdb1fcf81aa6a8b9829c405a149253ff5765c87f2efb79f15b1aec15abec"}
Mar 18 16:44:17.061255 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.061237 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:44:17.061337 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.061295 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:44:17.075485 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.075449 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2" podStartSLOduration=9.885115808 podStartE2EDuration="27.075437475s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.649032785 +0000 UTC m=+3.240973314" lastFinishedPulling="2026-03-18 16:44:09.839354452 +0000 UTC m=+20.431294981" observedRunningTime="2026-03-18 16:44:17.074925631 +0000 UTC m=+27.666866176" watchObservedRunningTime="2026-03-18 16:44:17.075437475 +0000 UTC m=+27.667378025"
Mar 18 16:44:17.937537 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.937487 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:44:17.937919 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:17.937740 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3"
Mar 18 16:44:17.937919 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:17.937791 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:17.938011 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:17.937922 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049"
Mar 18 16:44:18.050913 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:18.050878 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-svgkn"]
Mar 18 16:44:18.051338 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:18.051011 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:18.051338 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:18.051123 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgkn" podUID="93ac321e-d058-4c0a-9994-91d01edf06db"
Mar 18 16:44:18.054211 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:18.054185 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s8rrk"]
Mar 18 16:44:18.054335 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:18.054267 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:18.054393 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:18.054348 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049"
Mar 18 16:44:18.065913 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:18.065874 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kbxwz"]
Mar 18 16:44:18.066061 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:18.065967 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:44:18.066182 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:18.066066 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3"
Mar 18 16:44:19.937456 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:19.937426 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:44:19.937872 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:19.937532 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:19.937872 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:19.937571 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:19.937872 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:19.937561 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3"
Mar 18 16:44:19.937872 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:19.937646 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8rrk" podUID="fde7e0f5-379e-4950-a766-5b94afe18049"
Mar 18 16:44:19.937872 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:19.937724 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-svgkn" podUID="93ac321e-d058-4c0a-9994-91d01edf06db"
Mar 18 16:44:21.695915 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.695694 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-231.ec2.internal" event="NodeReady"
Mar 18 16:44:21.696375 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.696023 2563 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 18 16:44:21.728764 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.728731 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-64977ff748-pw6tv"]
Mar 18 16:44:21.765300 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.765275 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-kqsnz"]
Mar 18 16:44:21.765439 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.765422 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:21.767440 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.767413 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Mar 18 16:44:21.767574 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.767462 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rcgfm\"" Mar 18 16:44:21.767574 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.767477 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Mar 18 16:44:21.767847 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.767422 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Mar 18 16:44:21.773074 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.773056 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Mar 18 16:44:21.791671 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.791645 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64977ff748-pw6tv"] Mar 18 16:44:21.791671 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.791672 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-kqsnz"] Mar 18 16:44:21.791800 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.791688 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kqxqw"] Mar 18 16:44:21.791868 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.791805 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:44:21.793778 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.793751 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Mar 18 16:44:21.793778 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.793773 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Mar 18 16:44:21.794061 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.794041 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-glssf\"" Mar 18 16:44:21.822098 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.822079 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rzxp5"] Mar 18 16:44:21.822267 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.822250 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:21.825098 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.825075 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 18 16:44:21.825197 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.825107 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 18 16:44:21.825197 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.825116 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-775g5\"" Mar 18 16:44:21.840560 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.840540 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kqxqw"] Mar 18 16:44:21.840560 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.840562 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rzxp5"] Mar 18 16:44:21.840684 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.840662 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:44:21.843053 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.843025 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Mar 18 16:44:21.843053 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.843035 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Mar 18 16:44:21.843187 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.843035 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tcvm2\"" Mar 18 16:44:21.843187 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.843023 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Mar 18 16:44:21.931544 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931489 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-registry-certificates\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:21.931699 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931557 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-trusted-ca\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:21.931699 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931584 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-bound-sa-token\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:21.931699 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931609 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a2dba0-4d01-448c-accb-07510f0c8197-config-volume\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:21.931699 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931638 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/81a2dba0-4d01-448c-accb-07510f0c8197-tmp-dir\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:21.931913 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931702 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:44:21.931913 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931785 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: 
\"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:21.931913 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931863 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjptd\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-kube-api-access-fjptd\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:21.931913 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931891 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/491e0153-8caa-4407-b31c-f8618b35079b-ca-trust-extracted\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:21.931913 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931912 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:21.932146 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931945 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:44:21.932146 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.931971 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-7spfx\" (UniqueName: \"kubernetes.io/projected/81a2dba0-4d01-448c-accb-07510f0c8197-kube-api-access-7spfx\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:21.932146 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.932007 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-image-registry-private-configuration\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:21.932146 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.932063 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/19412877-e39c-4727-8a7e-12bcaf2b8450-nginx-conf\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:44:21.932146 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.932104 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhgxb\" (UniqueName: \"kubernetes.io/projected/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-kube-api-access-xhgxb\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:44:21.932146 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.932134 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-installation-pull-secrets\") pod \"image-registry-64977ff748-pw6tv\" 
(UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:21.936603 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.936580 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn" Mar 18 16:44:21.936744 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.936704 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:44:21.936744 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.936579 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:44:21.939178 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.938964 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Mar 18 16:44:21.939178 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.938985 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Mar 18 16:44:21.939178 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.938990 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrjl4\"" Mar 18 16:44:21.939178 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.939007 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Mar 18 16:44:21.939178 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.939017 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Mar 18 16:44:21.939460 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:21.939359 2563 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gnx58\"" Mar 18 16:44:22.033253 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033228 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhgxb\" (UniqueName: \"kubernetes.io/projected/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-kube-api-access-xhgxb\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:44:22.033427 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033274 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-installation-pull-secrets\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.033526 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033437 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-registry-certificates\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.033526 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033474 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-trusted-ca\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.033526 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033518 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-bound-sa-token\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.033684 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033543 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a2dba0-4d01-448c-accb-07510f0c8197-config-volume\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:22.033684 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033570 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/81a2dba0-4d01-448c-accb-07510f0c8197-tmp-dir\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:22.033684 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033613 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:44:22.033684 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033652 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.033870 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033687 
2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fjptd\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-kube-api-access-fjptd\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.033870 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033722 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/491e0153-8caa-4407-b31c-f8618b35079b-ca-trust-extracted\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.033870 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033746 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:22.033870 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.033839 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 18 16:44:22.033870 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.033861 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:22.034095 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.033908 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert podName:19412877-e39c-4727-8a7e-12bcaf2b8450 nodeName:}" failed. 
No retries permitted until 2026-03-18 16:44:22.533879764 +0000 UTC m=+33.125820297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-kqsnz" (UID: "19412877-e39c-4727-8a7e-12bcaf2b8450") : secret "networking-console-plugin-cert" not found Mar 18 16:44:22.034095 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.033929 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls podName:81a2dba0-4d01-448c-accb-07510f0c8197 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:22.533917447 +0000 UTC m=+33.125857984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls") pod "dns-default-kqxqw" (UID: "81a2dba0-4d01-448c-accb-07510f0c8197") : secret "dns-default-metrics-tls" not found Mar 18 16:44:22.034095 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033951 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:44:22.034095 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.033983 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7spfx\" (UniqueName: \"kubernetes.io/projected/81a2dba0-4d01-448c-accb-07510f0c8197-kube-api-access-7spfx\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:22.034095 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.034025 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-image-registry-private-configuration\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.034095 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.034073 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/19412877-e39c-4727-8a7e-12bcaf2b8450-nginx-conf\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:44:22.034381 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.034207 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/81a2dba0-4d01-448c-accb-07510f0c8197-tmp-dir\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:22.034381 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.034211 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-registry-certificates\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.034381 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.034305 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:22.034381 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.034355 2563 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert podName:f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:22.534337183 +0000 UTC m=+33.126277723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert") pod "ingress-canary-rzxp5" (UID: "f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c") : secret "canary-serving-cert" not found Mar 18 16:44:22.034597 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.034425 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81a2dba0-4d01-448c-accb-07510f0c8197-config-volume\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:22.034597 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.034521 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:22.034597 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.034534 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64977ff748-pw6tv: secret "image-registry-tls" not found Mar 18 16:44:22.034597 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.034557 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-trusted-ca\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.034597 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.034585 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls 
podName:491e0153-8caa-4407-b31c-f8618b35079b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:22.534570039 +0000 UTC m=+33.126510573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls") pod "image-registry-64977ff748-pw6tv" (UID: "491e0153-8caa-4407-b31c-f8618b35079b") : secret "image-registry-tls" not found Mar 18 16:44:22.034597 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.034587 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/491e0153-8caa-4407-b31c-f8618b35079b-ca-trust-extracted\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.034865 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.034729 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/19412877-e39c-4727-8a7e-12bcaf2b8450-nginx-conf\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:44:22.038054 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.038032 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-image-registry-private-configuration\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:44:22.038146 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.038091 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-installation-pull-secrets\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:44:22.050413 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.050387 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7spfx\" (UniqueName: \"kubernetes.io/projected/81a2dba0-4d01-448c-accb-07510f0c8197-kube-api-access-7spfx\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw"
Mar 18 16:44:22.051392 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.051366 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhgxb\" (UniqueName: \"kubernetes.io/projected/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-kube-api-access-xhgxb\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5"
Mar 18 16:44:22.051577 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.051556 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-bound-sa-token\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:44:22.053445 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.053424 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjptd\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-kube-api-access-fjptd\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:44:22.537091 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.537049 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz"
Mar 18 16:44:22.537250 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.537100 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:44:22.537250 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.537140 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw"
Mar 18 16:44:22.537250 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.537174 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5"
Mar 18 16:44:22.537250 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.537202 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 18 16:44:22.537250 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.537244 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:44:22.537396 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.537271 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert podName:19412877-e39c-4727-8a7e-12bcaf2b8450 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.537250384 +0000 UTC m=+34.129190914 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-kqsnz" (UID: "19412877-e39c-4727-8a7e-12bcaf2b8450") : secret "networking-console-plugin-cert" not found
Mar 18 16:44:22.537396 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.537272 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:44:22.537396 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.537288 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert podName:f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.537277729 +0000 UTC m=+34.129218262 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert") pod "ingress-canary-rzxp5" (UID: "f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c") : secret "canary-serving-cert" not found
Mar 18 16:44:22.537396 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.537297 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64977ff748-pw6tv: secret "image-registry-tls" not found
Mar 18 16:44:22.537396 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.537280 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:44:22.537396 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.537369 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls podName:81a2dba0-4d01-448c-accb-07510f0c8197 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.537349299 +0000 UTC m=+34.129289835 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls") pod "dns-default-kqxqw" (UID: "81a2dba0-4d01-448c-accb-07510f0c8197") : secret "dns-default-metrics-tls" not found
Mar 18 16:44:22.537396 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:22.537387 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls podName:491e0153-8caa-4407-b31c-f8618b35079b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:23.537381027 +0000 UTC m=+34.129321560 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls") pod "image-registry-64977ff748-pw6tv" (UID: "491e0153-8caa-4407-b31c-f8618b35079b") : secret "image-registry-tls" not found
Mar 18 16:44:22.638806 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.638770 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:22.640898 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.640872 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/93ac321e-d058-4c0a-9994-91d01edf06db-original-pull-secret\") pod \"global-pull-secret-syncer-svgkn\" (UID: \"93ac321e-d058-4c0a-9994-91d01edf06db\") " pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:22.849008 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:22.848943 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-svgkn"
Mar 18 16:44:23.091902 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.091742 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-svgkn"]
Mar 18 16:44:23.094968 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:44:23.094946 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93ac321e_d058_4c0a_9994_91d01edf06db.slice/crio-bf237dd2f67e5c89e50052bad692a482df242c10581422e2e5e298da246abd3a WatchSource:0}: Error finding container bf237dd2f67e5c89e50052bad692a482df242c10581422e2e5e298da246abd3a: Status 404 returned error can't find the container with id bf237dd2f67e5c89e50052bad692a482df242c10581422e2e5e298da246abd3a
Mar 18 16:44:23.545623 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.545555 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz"
Mar 18 16:44:23.545623 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.545589 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:44:23.545623 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.545612 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw"
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.545636 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5"
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.545701 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.545719 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.545738 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64977ff748-pw6tv: secret "image-registry-tls" not found
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.545721 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.545750 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.545771 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert podName:19412877-e39c-4727-8a7e-12bcaf2b8450 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:25.545752232 +0000 UTC m=+36.137692777 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-kqsnz" (UID: "19412877-e39c-4727-8a7e-12bcaf2b8450") : secret "networking-console-plugin-cert" not found
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.545789 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert podName:f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:25.545778404 +0000 UTC m=+36.137718932 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert") pod "ingress-canary-rzxp5" (UID: "f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c") : secret "canary-serving-cert" not found
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.545803 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls podName:81a2dba0-4d01-448c-accb-07510f0c8197 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:25.545795672 +0000 UTC m=+36.137736201 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls") pod "dns-default-kqxqw" (UID: "81a2dba0-4d01-448c-accb-07510f0c8197") : secret "dns-default-metrics-tls" not found
Mar 18 16:44:23.545911 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.545815 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls podName:491e0153-8caa-4407-b31c-f8618b35079b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:25.545808715 +0000 UTC m=+36.137749244 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls") pod "image-registry-64977ff748-pw6tv" (UID: "491e0153-8caa-4407-b31c-f8618b35079b") : secret "image-registry-tls" not found
Mar 18 16:44:23.646694 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.646661 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz"
Mar 18 16:44:23.646843 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.646797 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 18 16:44:23.646899 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:23.646859 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs podName:c0b2d8c5-09a4-472a-aad2-fa033be042f3 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:55.646841107 +0000 UTC m=+66.238781644 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs") pod "network-metrics-daemon-kbxwz" (UID: "c0b2d8c5-09a4-472a-aad2-fa033be042f3") : secret "metrics-daemon-secret" not found
Mar 18 16:44:23.747689 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.747660 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftg87\" (UniqueName: \"kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87\") pod \"network-check-target-s8rrk\" (UID: \"fde7e0f5-379e-4950-a766-5b94afe18049\") " pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:23.751168 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.751145 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftg87\" (UniqueName: \"kubernetes.io/projected/fde7e0f5-379e-4950-a766-5b94afe18049-kube-api-access-ftg87\") pod \"network-check-target-s8rrk\" (UID: \"fde7e0f5-379e-4950-a766-5b94afe18049\") " pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:23.759995 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.759959 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:23.886276 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:23.886242 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s8rrk"]
Mar 18 16:44:23.889964 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:44:23.889936 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfde7e0f5_379e_4950_a766_5b94afe18049.slice/crio-d6971534fe397afa522f79c4ff6cb556e10a46b59e8d846a6d9c9b565c7ac8d5 WatchSource:0}: Error finding container d6971534fe397afa522f79c4ff6cb556e10a46b59e8d846a6d9c9b565c7ac8d5: Status 404 returned error can't find the container with id d6971534fe397afa522f79c4ff6cb556e10a46b59e8d846a6d9c9b565c7ac8d5
Mar 18 16:44:24.071735 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:24.071672 2563 generic.go:358] "Generic (PLEG): container finished" podID="ef8cdb92-dfe0-4b41-8256-a466ea85d67a" containerID="b1800ff942a24a3370a0ed06c3a2e067fdd917293218b85b4b601aa4912a963f" exitCode=0
Mar 18 16:44:24.071866 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:24.071752 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" event={"ID":"ef8cdb92-dfe0-4b41-8256-a466ea85d67a","Type":"ContainerDied","Data":"b1800ff942a24a3370a0ed06c3a2e067fdd917293218b85b4b601aa4912a963f"}
Mar 18 16:44:24.072898 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:24.072871 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s8rrk" event={"ID":"fde7e0f5-379e-4950-a766-5b94afe18049","Type":"ContainerStarted","Data":"d6971534fe397afa522f79c4ff6cb556e10a46b59e8d846a6d9c9b565c7ac8d5"}
Mar 18 16:44:24.073946 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:24.073924 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-svgkn" event={"ID":"93ac321e-d058-4c0a-9994-91d01edf06db","Type":"ContainerStarted","Data":"bf237dd2f67e5c89e50052bad692a482df242c10581422e2e5e298da246abd3a"}
Mar 18 16:44:25.080484 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:25.080447 2563 generic.go:358] "Generic (PLEG): container finished" podID="ef8cdb92-dfe0-4b41-8256-a466ea85d67a" containerID="57a086e08b26d072c414db35454deb5fa8d330a4bcbf5d57194358e33e040d1a" exitCode=0
Mar 18 16:44:25.080945 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:25.080540 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" event={"ID":"ef8cdb92-dfe0-4b41-8256-a466ea85d67a","Type":"ContainerDied","Data":"57a086e08b26d072c414db35454deb5fa8d330a4bcbf5d57194358e33e040d1a"}
Mar 18 16:44:25.565135 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:25.564912 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz"
Mar 18 16:44:25.565305 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:25.565160 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:44:25.565305 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:25.565183 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 18 16:44:25.565305 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:25.565204 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw"
Mar 18 16:44:25.565305 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:25.565241 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5"
Mar 18 16:44:25.565305 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:25.565257 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert podName:19412877-e39c-4727-8a7e-12bcaf2b8450 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:29.565236496 +0000 UTC m=+40.157177027 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-kqsnz" (UID: "19412877-e39c-4727-8a7e-12bcaf2b8450") : secret "networking-console-plugin-cert" not found
Mar 18 16:44:25.565305 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:25.565292 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:44:25.565305 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:25.565303 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:44:25.565305 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:25.565308 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64977ff748-pw6tv: secret "image-registry-tls" not found
Mar 18 16:44:25.565728 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:25.565348 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert podName:f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:29.565333438 +0000 UTC m=+40.157273968 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert") pod "ingress-canary-rzxp5" (UID: "f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c") : secret "canary-serving-cert" not found
Mar 18 16:44:25.565728 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:25.565364 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls podName:491e0153-8caa-4407-b31c-f8618b35079b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:29.565356195 +0000 UTC m=+40.157296727 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls") pod "image-registry-64977ff748-pw6tv" (UID: "491e0153-8caa-4407-b31c-f8618b35079b") : secret "image-registry-tls" not found
Mar 18 16:44:25.565728 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:25.565389 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:44:25.565728 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:25.565425 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls podName:81a2dba0-4d01-448c-accb-07510f0c8197 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:29.565413513 +0000 UTC m=+40.157354059 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls") pod "dns-default-kqxqw" (UID: "81a2dba0-4d01-448c-accb-07510f0c8197") : secret "dns-default-metrics-tls" not found
Mar 18 16:44:26.086245 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:26.086213 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" event={"ID":"ef8cdb92-dfe0-4b41-8256-a466ea85d67a","Type":"ContainerStarted","Data":"bb62d5deb728794b09dcefc1eda6b39384b78d5f6fcb685e2c52fe501347cf93"}
Mar 18 16:44:26.110959 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:26.110900 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jxfqg" podStartSLOduration=5.806616061 podStartE2EDuration="36.110880836s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:43:52.653679787 +0000 UTC m=+3.245620319" lastFinishedPulling="2026-03-18 16:44:22.957944556 +0000 UTC m=+33.549885094" observedRunningTime="2026-03-18 16:44:26.110697277 +0000 UTC m=+36.702637830" watchObservedRunningTime="2026-03-18 16:44:26.110880836 +0000 UTC m=+36.702821392"
Mar 18 16:44:29.093534 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:29.093489 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-svgkn" event={"ID":"93ac321e-d058-4c0a-9994-91d01edf06db","Type":"ContainerStarted","Data":"32c3e77c212a3a7bad16d61e09f8a024f3b60d91e81f4814683998949dbc2b59"}
Mar 18 16:44:29.094930 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:29.094908 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s8rrk" event={"ID":"fde7e0f5-379e-4950-a766-5b94afe18049","Type":"ContainerStarted","Data":"505971a3b0538cd9c867dc1679e79b9a664d7aeb04988eb60d5baf831812cb5a"}
Mar 18 16:44:29.095037 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:29.095025 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-s8rrk"
Mar 18 16:44:29.106688 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:29.106646 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-svgkn" podStartSLOduration=17.790912811 podStartE2EDuration="23.106633754s" podCreationTimestamp="2026-03-18 16:44:06 +0000 UTC" firstStartedPulling="2026-03-18 16:44:23.096938265 +0000 UTC m=+33.688878796" lastFinishedPulling="2026-03-18 16:44:28.412659194 +0000 UTC m=+39.004599739" observedRunningTime="2026-03-18 16:44:29.10587414 +0000 UTC m=+39.697814691" watchObservedRunningTime="2026-03-18 16:44:29.106633754 +0000 UTC m=+39.698574328"
Mar 18 16:44:29.123018 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:29.122979 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-s8rrk" podStartSLOduration=34.594138413 podStartE2EDuration="39.122966862s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:44:23.892054888 +0000 UTC m=+34.483995422" lastFinishedPulling="2026-03-18 16:44:28.420883325 +0000 UTC m=+39.012823871" observedRunningTime="2026-03-18 16:44:29.121969108 +0000 UTC m=+39.713909660" watchObservedRunningTime="2026-03-18 16:44:29.122966862 +0000 UTC m=+39.714907438"
Mar 18 16:44:29.597374 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:29.597340 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz"
Mar 18 16:44:29.597374 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:29.597378 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:29.597401 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw"
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:29.597422 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5"
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:29.597492 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:29.597527 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64977ff748-pw6tv: secret "image-registry-tls" not found
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:29.597533 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:29.597546 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:29.597559 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:29.597583 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert podName:f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:37.597570642 +0000 UTC m=+48.189511172 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert") pod "ingress-canary-rzxp5" (UID: "f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c") : secret "canary-serving-cert" not found
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:29.597596 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls podName:491e0153-8caa-4407-b31c-f8618b35079b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:37.597590067 +0000 UTC m=+48.189530596 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls") pod "image-registry-64977ff748-pw6tv" (UID: "491e0153-8caa-4407-b31c-f8618b35079b") : secret "image-registry-tls" not found
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:29.597606 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls podName:81a2dba0-4d01-448c-accb-07510f0c8197 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:37.597600424 +0000 UTC m=+48.189540953 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls") pod "dns-default-kqxqw" (UID: "81a2dba0-4d01-448c-accb-07510f0c8197") : secret "dns-default-metrics-tls" not found
Mar 18 16:44:29.597645 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:29.597616 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert podName:19412877-e39c-4727-8a7e-12bcaf2b8450 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:37.597610557 +0000 UTC m=+48.189551086 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-kqsnz" (UID: "19412877-e39c-4727-8a7e-12bcaf2b8450") : secret "networking-console-plugin-cert" not found
Mar 18 16:44:37.651493 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:37.651454 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz"
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:37.651514 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:37.651542 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw"
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:37.651563 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5"
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:37.651609 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:37.651633 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64977ff748-pw6tv: secret "image-registry-tls" not found
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:37.651682 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls podName:491e0153-8caa-4407-b31c-f8618b35079b nodeName:}" failed. No retries permitted until 2026-03-18 16:44:53.651666668 +0000 UTC m=+64.243607198 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls") pod "image-registry-64977ff748-pw6tv" (UID: "491e0153-8caa-4407-b31c-f8618b35079b") : secret "image-registry-tls" not found
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:37.651607 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:37.651737 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:37.651751 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert podName:19412877-e39c-4727-8a7e-12bcaf2b8450 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:53.651733478 +0000 UTC m=+64.243674006 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-kqsnz" (UID: "19412877-e39c-4727-8a7e-12bcaf2b8450") : secret "networking-console-plugin-cert" not found
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:37.651788 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:37.651794 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls podName:81a2dba0-4d01-448c-accb-07510f0c8197 nodeName:}" failed. No retries permitted until 2026-03-18 16:44:53.651776317 +0000 UTC m=+64.243716846 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls") pod "dns-default-kqxqw" (UID: "81a2dba0-4d01-448c-accb-07510f0c8197") : secret "dns-default-metrics-tls" not found
Mar 18 16:44:37.651993 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:37.651829 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert podName:f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c nodeName:}" failed. No retries permitted until 2026-03-18 16:44:53.651819346 +0000 UTC m=+64.243759874 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert") pod "ingress-canary-rzxp5" (UID: "f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c") : secret "canary-serving-cert" not found
Mar 18 16:44:49.064472 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:49.064445 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9gjw2"
Mar 18 16:44:53.667977 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:53.667942 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5"
Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:53.668010 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz"
Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:53.668032 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:53.668053 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName:
\"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:53.668092 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:53.668131 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:53.668134 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:53.668169 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:53.668186 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64977ff748-pw6tv: secret "image-registry-tls" not found Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:53.668174 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert podName:f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c nodeName:}" failed. No retries permitted until 2026-03-18 16:45:25.668153906 +0000 UTC m=+96.260094455 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert") pod "ingress-canary-rzxp5" (UID: "f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c") : secret "canary-serving-cert" not found Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:53.668241 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls podName:81a2dba0-4d01-448c-accb-07510f0c8197 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:25.668227202 +0000 UTC m=+96.260167743 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls") pod "dns-default-kqxqw" (UID: "81a2dba0-4d01-448c-accb-07510f0c8197") : secret "dns-default-metrics-tls" not found Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:53.668261 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert podName:19412877-e39c-4727-8a7e-12bcaf2b8450 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:25.668252456 +0000 UTC m=+96.260193017 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-kqsnz" (UID: "19412877-e39c-4727-8a7e-12bcaf2b8450") : secret "networking-console-plugin-cert" not found Mar 18 16:44:53.668320 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:53.668279 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls podName:491e0153-8caa-4407-b31c-f8618b35079b nodeName:}" failed. 
No retries permitted until 2026-03-18 16:45:25.668271896 +0000 UTC m=+96.260212424 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls") pod "image-registry-64977ff748-pw6tv" (UID: "491e0153-8caa-4407-b31c-f8618b35079b") : secret "image-registry-tls" not found Mar 18 16:44:55.681518 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:44:55.681446 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:44:55.681916 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:55.681599 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:44:55.681916 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:44:55.681661 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs podName:c0b2d8c5-09a4-472a-aad2-fa033be042f3 nodeName:}" failed. No retries permitted until 2026-03-18 16:45:59.681644777 +0000 UTC m=+130.273585321 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs") pod "network-metrics-daemon-kbxwz" (UID: "c0b2d8c5-09a4-472a-aad2-fa033be042f3") : secret "metrics-daemon-secret" not found Mar 18 16:45:00.098982 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:45:00.098955 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-s8rrk" Mar 18 16:45:25.695459 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:45:25.695420 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:45:25.695469 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:45:25.695490 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:45:25.695525 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:25.695566 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:25.695581 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:25.695597 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:25.695603 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64977ff748-pw6tv: secret "image-registry-tls" not found Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:25.695637 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert podName:19412877-e39c-4727-8a7e-12bcaf2b8450 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:29.695621963 +0000 UTC m=+160.287562492 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-kqsnz" (UID: "19412877-e39c-4727-8a7e-12bcaf2b8450") : secret "networking-console-plugin-cert" not found Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:25.695647 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:25.695660 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls podName:491e0153-8caa-4407-b31c-f8618b35079b nodeName:}" failed. No retries permitted until 2026-03-18 16:46:29.695643389 +0000 UTC m=+160.287583925 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls") pod "image-registry-64977ff748-pw6tv" (UID: "491e0153-8caa-4407-b31c-f8618b35079b") : secret "image-registry-tls" not found Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:25.695680 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert podName:f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c nodeName:}" failed. No retries permitted until 2026-03-18 16:46:29.695669247 +0000 UTC m=+160.287609789 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert") pod "ingress-canary-rzxp5" (UID: "f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c") : secret "canary-serving-cert" not found Mar 18 16:45:25.695935 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:25.695697 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls podName:81a2dba0-4d01-448c-accb-07510f0c8197 nodeName:}" failed. No retries permitted until 2026-03-18 16:46:29.695685938 +0000 UTC m=+160.287626473 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls") pod "dns-default-kqxqw" (UID: "81a2dba0-4d01-448c-accb-07510f0c8197") : secret "dns-default-metrics-tls" not found Mar 18 16:45:59.737697 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:45:59.737653 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:45:59.738213 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:59.737802 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 18 16:45:59.738213 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:45:59.737871 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs podName:c0b2d8c5-09a4-472a-aad2-fa033be042f3 nodeName:}" failed. No retries permitted until 2026-03-18 16:48:01.737854865 +0000 UTC m=+252.329795399 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs") pod "network-metrics-daemon-kbxwz" (UID: "c0b2d8c5-09a4-472a-aad2-fa033be042f3") : secret "metrics-daemon-secret" not found Mar 18 16:46:24.778110 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:24.778065 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-64977ff748-pw6tv" podUID="491e0153-8caa-4407-b31c-f8618b35079b" Mar 18 16:46:24.801866 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:24.801837 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" podUID="19412877-e39c-4727-8a7e-12bcaf2b8450" Mar 18 16:46:24.832104 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:24.832063 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-kqxqw" podUID="81a2dba0-4d01-448c-accb-07510f0c8197" Mar 18 16:46:24.850295 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:24.850272 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rzxp5" podUID="f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c" Mar 18 16:46:24.954159 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:24.954127 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-multus/network-metrics-daemon-kbxwz" podUID="c0b2d8c5-09a4-472a-aad2-fa033be042f3" Mar 18 16:46:25.312031 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:25.311999 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kqxqw" Mar 18 16:46:25.312031 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:25.312028 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:46:29.743565 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:29.743528 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:29.743572 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") pod \"image-registry-64977ff748-pw6tv\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") " pod="openshift-image-registry/image-registry-64977ff748-pw6tv" Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:29.743604 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:29.743637 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:29.743668 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:29.743716 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:29.743732 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64977ff748-pw6tv: secret "image-registry-tls" not found Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:29.743756 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert podName:19412877-e39c-4727-8a7e-12bcaf2b8450 nodeName:}" failed. No retries permitted until 2026-03-18 16:48:31.743733698 +0000 UTC m=+282.335674233 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-kqsnz" (UID: "19412877-e39c-4727-8a7e-12bcaf2b8450") : secret "networking-console-plugin-cert" not found Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:29.743758 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:29.743762 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:29.743775 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls podName:491e0153-8caa-4407-b31c-f8618b35079b nodeName:}" failed. No retries permitted until 2026-03-18 16:48:31.743764092 +0000 UTC m=+282.335704633 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls") pod "image-registry-64977ff748-pw6tv" (UID: "491e0153-8caa-4407-b31c-f8618b35079b") : secret "image-registry-tls" not found Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:29.743863 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert podName:f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c nodeName:}" failed. No retries permitted until 2026-03-18 16:48:31.743845716 +0000 UTC m=+282.335786248 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert") pod "ingress-canary-rzxp5" (UID: "f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c") : secret "canary-serving-cert" not found Mar 18 16:46:29.744008 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:29.743877 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls podName:81a2dba0-4d01-448c-accb-07510f0c8197 nodeName:}" failed. No retries permitted until 2026-03-18 16:48:31.74386908 +0000 UTC m=+282.335809610 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls") pod "dns-default-kqxqw" (UID: "81a2dba0-4d01-448c-accb-07510f0c8197") : secret "dns-default-metrics-tls" not found Mar 18 16:46:32.763517 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.763474 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-76b8565867-cbztj"] Mar 18 16:46:32.766089 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.766073 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.768207 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.768143 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Mar 18 16:46:32.768207 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.768180 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Mar 18 16:46:32.769521 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.769061 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Mar 18 16:46:32.769624 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.769575 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:32.770490 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.769710 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qmc47\"" Mar 18 16:46:32.776790 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.776769 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Mar 18 16:46:32.811074 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.811050 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-cbztj"] Mar 18 16:46:32.866820 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.866790 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f07a196d-57d0-4b38-a848-ed2272802021-serving-cert\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " 
pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.866923 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.866842 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmdtk\" (UniqueName: \"kubernetes.io/projected/f07a196d-57d0-4b38-a848-ed2272802021-kube-api-access-zmdtk\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.866923 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.866905 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07a196d-57d0-4b38-a848-ed2272802021-config\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.867028 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.866932 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f07a196d-57d0-4b38-a848-ed2272802021-trusted-ca\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.967732 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.967710 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07a196d-57d0-4b38-a848-ed2272802021-config\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.967827 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.967744 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f07a196d-57d0-4b38-a848-ed2272802021-trusted-ca\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.967879 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.967842 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f07a196d-57d0-4b38-a848-ed2272802021-serving-cert\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.967927 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.967880 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmdtk\" (UniqueName: \"kubernetes.io/projected/f07a196d-57d0-4b38-a848-ed2272802021-kube-api-access-zmdtk\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.968910 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.968891 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f07a196d-57d0-4b38-a848-ed2272802021-config\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.969039 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.969024 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f07a196d-57d0-4b38-a848-ed2272802021-trusted-ca\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 
16:46:32.970990 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.970975 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f07a196d-57d0-4b38-a848-ed2272802021-serving-cert\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:32.974732 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:32.974701 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmdtk\" (UniqueName: \"kubernetes.io/projected/f07a196d-57d0-4b38-a848-ed2272802021-kube-api-access-zmdtk\") pod \"console-operator-76b8565867-cbztj\" (UID: \"f07a196d-57d0-4b38-a848-ed2272802021\") " pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:33.077959 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:33.077903 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:33.186848 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:33.186825 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-cbztj"] Mar 18 16:46:33.190311 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:46:33.190290 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf07a196d_57d0_4b38_a848_ed2272802021.slice/crio-148551fe7c8262af9208a83bccad7ef326dd86accee929458cef733c25393888 WatchSource:0}: Error finding container 148551fe7c8262af9208a83bccad7ef326dd86accee929458cef733c25393888: Status 404 returned error can't find the container with id 148551fe7c8262af9208a83bccad7ef326dd86accee929458cef733c25393888 Mar 18 16:46:33.326266 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:33.326233 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-76b8565867-cbztj" event={"ID":"f07a196d-57d0-4b38-a848-ed2272802021","Type":"ContainerStarted","Data":"148551fe7c8262af9208a83bccad7ef326dd86accee929458cef733c25393888"} Mar 18 16:46:35.332033 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:35.331962 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/0.log" Mar 18 16:46:35.332033 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:35.332005 2563 generic.go:358] "Generic (PLEG): container finished" podID="f07a196d-57d0-4b38-a848-ed2272802021" containerID="74dcf35885d568cec8a2376ed973b8506579851115fafe05ac248b39628a7093" exitCode=255 Mar 18 16:46:35.332419 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:35.332080 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-cbztj" event={"ID":"f07a196d-57d0-4b38-a848-ed2272802021","Type":"ContainerDied","Data":"74dcf35885d568cec8a2376ed973b8506579851115fafe05ac248b39628a7093"} Mar 18 16:46:35.332419 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:35.332267 2563 scope.go:117] "RemoveContainer" containerID="74dcf35885d568cec8a2376ed973b8506579851115fafe05ac248b39628a7093" Mar 18 16:46:35.936353 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:35.936315 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:46:36.335733 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:36.335654 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/1.log" Mar 18 16:46:36.336138 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:36.336100 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/0.log" Mar 18 16:46:36.336179 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:36.336135 2563 generic.go:358] "Generic (PLEG): container finished" podID="f07a196d-57d0-4b38-a848-ed2272802021" containerID="8900e19c172951649ced1d608c681de3d5341c72c7a13c1a5b224382f1407c33" exitCode=255 Mar 18 16:46:36.336213 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:36.336167 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-cbztj" event={"ID":"f07a196d-57d0-4b38-a848-ed2272802021","Type":"ContainerDied","Data":"8900e19c172951649ced1d608c681de3d5341c72c7a13c1a5b224382f1407c33"} Mar 18 16:46:36.336251 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:36.336212 2563 scope.go:117] "RemoveContainer" containerID="74dcf35885d568cec8a2376ed973b8506579851115fafe05ac248b39628a7093" Mar 18 16:46:36.336445 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:36.336423 2563 scope.go:117] "RemoveContainer" containerID="8900e19c172951649ced1d608c681de3d5341c72c7a13c1a5b224382f1407c33" Mar 18 16:46:36.336626 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:36.336607 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-76b8565867-cbztj_openshift-console-operator(f07a196d-57d0-4b38-a848-ed2272802021)\"" pod="openshift-console-operator/console-operator-76b8565867-cbztj" podUID="f07a196d-57d0-4b38-a848-ed2272802021" Mar 18 16:46:36.936946 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:36.936890 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:46:37.306714 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.306641 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt"] Mar 18 16:46:37.309651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.309636 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt" Mar 18 16:46:37.311595 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.311576 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Mar 18 16:46:37.312012 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.311994 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-lvl27\"" Mar 18 16:46:37.312101 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.312008 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Mar 18 16:46:37.315541 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.315519 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt"] Mar 18 16:46:37.339140 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.339123 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/1.log" Mar 18 16:46:37.339446 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.339432 2563 scope.go:117] "RemoveContainer" containerID="8900e19c172951649ced1d608c681de3d5341c72c7a13c1a5b224382f1407c33" Mar 18 16:46:37.339610 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:37.339596 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-cbztj_openshift-console-operator(f07a196d-57d0-4b38-a848-ed2272802021)\"" pod="openshift-console-operator/console-operator-76b8565867-cbztj" podUID="f07a196d-57d0-4b38-a848-ed2272802021" Mar 18 16:46:37.402849 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.402817 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzdhm\" (UniqueName: \"kubernetes.io/projected/e701bc5a-b789-4dcd-9e11-12dadc2022b2-kube-api-access-vzdhm\") pod \"migrator-6b589cdcc-676zt\" (UID: \"e701bc5a-b789-4dcd-9e11-12dadc2022b2\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt" Mar 18 16:46:37.503989 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.503959 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzdhm\" (UniqueName: \"kubernetes.io/projected/e701bc5a-b789-4dcd-9e11-12dadc2022b2-kube-api-access-vzdhm\") pod \"migrator-6b589cdcc-676zt\" (UID: \"e701bc5a-b789-4dcd-9e11-12dadc2022b2\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt" Mar 18 16:46:37.512219 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.512196 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzdhm\" (UniqueName: 
\"kubernetes.io/projected/e701bc5a-b789-4dcd-9e11-12dadc2022b2-kube-api-access-vzdhm\") pod \"migrator-6b589cdcc-676zt\" (UID: \"e701bc5a-b789-4dcd-9e11-12dadc2022b2\") " pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt" Mar 18 16:46:37.618665 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.618634 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt" Mar 18 16:46:37.730937 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:37.730907 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt"] Mar 18 16:46:37.734154 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:46:37.734126 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode701bc5a_b789_4dcd_9e11_12dadc2022b2.slice/crio-b3427cca1f0744091e979cddc879353815d22d07d61497347c74b2c4b73e3dd8 WatchSource:0}: Error finding container b3427cca1f0744091e979cddc879353815d22d07d61497347c74b2c4b73e3dd8: Status 404 returned error can't find the container with id b3427cca1f0744091e979cddc879353815d22d07d61497347c74b2c4b73e3dd8 Mar 18 16:46:38.341882 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:38.341847 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt" event={"ID":"e701bc5a-b789-4dcd-9e11-12dadc2022b2","Type":"ContainerStarted","Data":"b3427cca1f0744091e979cddc879353815d22d07d61497347c74b2c4b73e3dd8"} Mar 18 16:46:39.344873 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:39.344848 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt" event={"ID":"e701bc5a-b789-4dcd-9e11-12dadc2022b2","Type":"ContainerStarted","Data":"9d80a55db7503621f263bc608dbe2f93cfbaf1f59795ca92b25804009519f0bd"} Mar 18 16:46:39.344873 ip-10-0-141-231 
kubenswrapper[2563]: I0318 16:46:39.344879 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt" event={"ID":"e701bc5a-b789-4dcd-9e11-12dadc2022b2","Type":"ContainerStarted","Data":"4f92689c73ed463c3068533f8ff27d7f1e8b8b1aed036a98f11167f0572a55d9"} Mar 18 16:46:39.359292 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:39.359245 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-6b589cdcc-676zt" podStartSLOduration=0.939469625 podStartE2EDuration="2.359230623s" podCreationTimestamp="2026-03-18 16:46:37 +0000 UTC" firstStartedPulling="2026-03-18 16:46:37.735866746 +0000 UTC m=+168.327807276" lastFinishedPulling="2026-03-18 16:46:39.155627733 +0000 UTC m=+169.747568274" observedRunningTime="2026-03-18 16:46:39.358738781 +0000 UTC m=+169.950679329" watchObservedRunningTime="2026-03-18 16:46:39.359230623 +0000 UTC m=+169.951171173" Mar 18 16:46:39.938047 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:39.938020 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:46:40.207334 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:40.207260 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mlpnc_d2859c82-5f61-4503-bb69-2147a77ca895/dns-node-resolver/0.log" Mar 18 16:46:41.207585 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:41.207560 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-f9chd_4aac1f96-b5de-49d5-84a3-176806d103dc/node-ca/0.log" Mar 18 16:46:42.407258 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:42.407224 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-676zt_e701bc5a-b789-4dcd-9e11-12dadc2022b2/migrator/0.log" Mar 18 16:46:42.606884 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:42.606857 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-676zt_e701bc5a-b789-4dcd-9e11-12dadc2022b2/graceful-termination/0.log" Mar 18 16:46:43.078385 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:43.078354 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:43.078576 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:43.078434 2563 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:46:43.078738 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:43.078726 2563 scope.go:117] "RemoveContainer" containerID="8900e19c172951649ced1d608c681de3d5341c72c7a13c1a5b224382f1407c33" Mar 18 16:46:43.078930 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:43.078912 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=console-operator pod=console-operator-76b8565867-cbztj_openshift-console-operator(f07a196d-57d0-4b38-a848-ed2272802021)\"" pod="openshift-console-operator/console-operator-76b8565867-cbztj" podUID="f07a196d-57d0-4b38-a848-ed2272802021" Mar 18 16:46:43.352655 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:43.352586 2563 scope.go:117] "RemoveContainer" containerID="8900e19c172951649ced1d608c681de3d5341c72c7a13c1a5b224382f1407c33" Mar 18 16:46:43.352795 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:43.352751 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-cbztj_openshift-console-operator(f07a196d-57d0-4b38-a848-ed2272802021)\"" pod="openshift-console-operator/console-operator-76b8565867-cbztj" podUID="f07a196d-57d0-4b38-a848-ed2272802021" Mar 18 16:46:54.937366 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:54.937338 2563 scope.go:117] "RemoveContainer" containerID="8900e19c172951649ced1d608c681de3d5341c72c7a13c1a5b224382f1407c33" Mar 18 16:46:55.376246 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:55.376222 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 16:46:55.376634 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:55.376618 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/1.log" Mar 18 16:46:55.376691 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:55.376652 2563 generic.go:358] "Generic (PLEG): container finished" podID="f07a196d-57d0-4b38-a848-ed2272802021" containerID="c164a081f852878657669ac1fcc028c75235a5996e3b7725613b31a5c4a9af00" exitCode=255 Mar 18 16:46:55.376722 ip-10-0-141-231 
kubenswrapper[2563]: I0318 16:46:55.376693 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-cbztj" event={"ID":"f07a196d-57d0-4b38-a848-ed2272802021","Type":"ContainerDied","Data":"c164a081f852878657669ac1fcc028c75235a5996e3b7725613b31a5c4a9af00"} Mar 18 16:46:55.376754 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:55.376722 2563 scope.go:117] "RemoveContainer" containerID="8900e19c172951649ced1d608c681de3d5341c72c7a13c1a5b224382f1407c33" Mar 18 16:46:55.377020 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:55.377004 2563 scope.go:117] "RemoveContainer" containerID="c164a081f852878657669ac1fcc028c75235a5996e3b7725613b31a5c4a9af00" Mar 18 16:46:55.377190 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:46:55.377167 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-cbztj_openshift-console-operator(f07a196d-57d0-4b38-a848-ed2272802021)\"" pod="openshift-console-operator/console-operator-76b8565867-cbztj" podUID="f07a196d-57d0-4b38-a848-ed2272802021" Mar 18 16:46:56.379725 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:46:56.379699 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 16:47:01.280789 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.280752 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dhxqp"] Mar 18 16:47:01.283653 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.283637 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.286219 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.286188 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Mar 18 16:47:01.286322 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.286191 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Mar 18 16:47:01.286714 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.286696 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Mar 18 16:47:01.286771 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.286698 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jqp6n\"" Mar 18 16:47:01.286817 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.286781 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Mar 18 16:47:01.297554 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.297534 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dhxqp"] Mar 18 16:47:01.372559 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.372533 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b785a821-c366-4ed7-8772-a55ce63347cb-data-volume\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.372664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.372572 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b785a821-c366-4ed7-8772-a55ce63347cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.372664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.372629 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b785a821-c366-4ed7-8772-a55ce63347cb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.372664 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.372652 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b785a821-c366-4ed7-8772-a55ce63347cb-crio-socket\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.372790 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.372668 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl685\" (UniqueName: \"kubernetes.io/projected/b785a821-c366-4ed7-8772-a55ce63347cb-kube-api-access-cl685\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.473574 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.473549 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b785a821-c366-4ed7-8772-a55ce63347cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dhxqp\" (UID: 
\"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.473718 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.473604 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b785a821-c366-4ed7-8772-a55ce63347cb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.473718 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.473630 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b785a821-c366-4ed7-8772-a55ce63347cb-crio-socket\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.473718 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.473645 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cl685\" (UniqueName: \"kubernetes.io/projected/b785a821-c366-4ed7-8772-a55ce63347cb-kube-api-access-cl685\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.473718 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.473692 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b785a821-c366-4ed7-8772-a55ce63347cb-data-volume\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.473924 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.473767 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/b785a821-c366-4ed7-8772-a55ce63347cb-crio-socket\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.474076 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.474060 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b785a821-c366-4ed7-8772-a55ce63347cb-data-volume\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.474259 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.474241 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b785a821-c366-4ed7-8772-a55ce63347cb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.475945 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.475924 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b785a821-c366-4ed7-8772-a55ce63347cb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.484185 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.484164 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl685\" (UniqueName: \"kubernetes.io/projected/b785a821-c366-4ed7-8772-a55ce63347cb-kube-api-access-cl685\") pod \"insights-runtime-extractor-dhxqp\" (UID: \"b785a821-c366-4ed7-8772-a55ce63347cb\") " pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.592633 ip-10-0-141-231 kubenswrapper[2563]: 
I0318 16:47:01.592585 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dhxqp" Mar 18 16:47:01.707473 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:01.707446 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dhxqp"] Mar 18 16:47:01.710735 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:47:01.710706 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb785a821_c366_4ed7_8772_a55ce63347cb.slice/crio-20a378fc16b16e42e36c9d8b3d5ab144caab4d3f07e7645b898daab2895d0549 WatchSource:0}: Error finding container 20a378fc16b16e42e36c9d8b3d5ab144caab4d3f07e7645b898daab2895d0549: Status 404 returned error can't find the container with id 20a378fc16b16e42e36c9d8b3d5ab144caab4d3f07e7645b898daab2895d0549 Mar 18 16:47:02.392169 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:02.392139 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhxqp" event={"ID":"b785a821-c366-4ed7-8772-a55ce63347cb","Type":"ContainerStarted","Data":"c648256cf0079665def137c5bdd5cabf17d6731f522f39b8b4fc834f48a326a2"} Mar 18 16:47:02.392169 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:02.392171 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhxqp" event={"ID":"b785a821-c366-4ed7-8772-a55ce63347cb","Type":"ContainerStarted","Data":"20a378fc16b16e42e36c9d8b3d5ab144caab4d3f07e7645b898daab2895d0549"} Mar 18 16:47:03.078804 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:03.078772 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:47:03.078804 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:03.078803 2563 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:47:03.079092 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:03.079081 2563 scope.go:117] "RemoveContainer" containerID="c164a081f852878657669ac1fcc028c75235a5996e3b7725613b31a5c4a9af00" Mar 18 16:47:03.079240 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:47:03.079225 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-cbztj_openshift-console-operator(f07a196d-57d0-4b38-a848-ed2272802021)\"" pod="openshift-console-operator/console-operator-76b8565867-cbztj" podUID="f07a196d-57d0-4b38-a848-ed2272802021" Mar 18 16:47:03.396605 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:03.396574 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhxqp" event={"ID":"b785a821-c366-4ed7-8772-a55ce63347cb","Type":"ContainerStarted","Data":"54e350b46dd91adc0a1d9523f8d4aa7be4b747d099313d93cdd0d9ca40841276"} Mar 18 16:47:04.400217 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:04.400165 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dhxqp" event={"ID":"b785a821-c366-4ed7-8772-a55ce63347cb","Type":"ContainerStarted","Data":"9764e68c9741d9b063af91f115453e4211b12e100c0e5ffef906ac4609e8619d"} Mar 18 16:47:04.418260 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:04.418214 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dhxqp" podStartSLOduration=0.981689142 podStartE2EDuration="3.418196145s" podCreationTimestamp="2026-03-18 16:47:01 +0000 UTC" firstStartedPulling="2026-03-18 16:47:01.773081612 +0000 UTC m=+192.365022141" lastFinishedPulling="2026-03-18 16:47:04.209588605 +0000 UTC m=+194.801529144" observedRunningTime="2026-03-18 16:47:04.418167684 
+0000 UTC m=+195.010108237" watchObservedRunningTime="2026-03-18 16:47:04.418196145 +0000 UTC m=+195.010136697" Mar 18 16:47:15.743698 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.743663 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn"] Mar 18 16:47:15.746822 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.746805 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.749080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.749057 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Mar 18 16:47:15.749205 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.749057 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Mar 18 16:47:15.749642 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.749623 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Mar 18 16:47:15.749642 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.749631 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Mar 18 16:47:15.749763 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.749656 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-v465r\"" Mar 18 16:47:15.750253 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.750240 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Mar 18 16:47:15.766576 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.766554 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn"] Mar 18 16:47:15.778495 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.778471 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2rhhp"] Mar 18 16:47:15.781416 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.781398 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.783361 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.783343 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Mar 18 16:47:15.784567 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.784550 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Mar 18 16:47:15.784720 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.784706 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qqtwn\"" Mar 18 16:47:15.784766 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.784757 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Mar 18 16:47:15.878205 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878182 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwqj4\" (UniqueName: \"kubernetes.io/projected/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-kube-api-access-qwqj4\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.878299 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878209 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.878299 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878243 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8rwn\" (UniqueName: \"kubernetes.io/projected/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-kube-api-access-t8rwn\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.878387 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878312 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-accelerators-collector-config\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.878387 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878370 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.878476 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878389 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-metrics-client-ca\") pod 
\"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.878553 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878487 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-sys\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.878553 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878540 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.878662 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878576 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-textfile\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.878662 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878617 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-root\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.878662 ip-10-0-141-231 kubenswrapper[2563]: I0318 
16:47:15.878642 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-metrics-client-ca\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.878774 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878708 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-tls\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.878774 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.878729 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-wtmp\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.937197 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.937181 2563 scope.go:117] "RemoveContainer" containerID="c164a081f852878657669ac1fcc028c75235a5996e3b7725613b31a5c4a9af00" Mar 18 16:47:15.979690 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979666 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.979788 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979700 2563 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.979788 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979750 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-sys\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.979788 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979782 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.979950 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979814 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-textfile\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.979950 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979848 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-sys\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.979950 
ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979854 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-root\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.979950 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979900 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-root\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.979950 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979901 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-metrics-client-ca\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.980188 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979960 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-tls\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.980188 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.979988 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-wtmp\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.980188 ip-10-0-141-231 
kubenswrapper[2563]: I0318 16:47:15.980035 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwqj4\" (UniqueName: \"kubernetes.io/projected/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-kube-api-access-qwqj4\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.980188 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.980067 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.980188 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.980108 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8rwn\" (UniqueName: \"kubernetes.io/projected/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-kube-api-access-t8rwn\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.980188 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.980143 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-accelerators-collector-config\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.980188 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.980146 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-wtmp\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.980557 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.980193 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-textfile\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.980557 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:47:15.980271 2563 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Mar 18 16:47:15.980557 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:47:15.980324 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-openshift-state-metrics-tls podName:03cfcb33-b824-4a9e-adb6-97fc3f3f59dc nodeName:}" failed. No retries permitted until 2026-03-18 16:47:16.480305834 +0000 UTC m=+207.072246366 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-openshift-state-metrics-tls") pod "openshift-state-metrics-68b5d5d464-f9vqn" (UID: "03cfcb33-b824-4a9e-adb6-97fc3f3f59dc") : secret "openshift-state-metrics-tls" not found Mar 18 16:47:15.980557 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.980528 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-metrics-client-ca\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.980755 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.980617 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.980755 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:47:15.980645 2563 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Mar 18 16:47:15.980755 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:47:15.980714 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-tls podName:ffc04bd0-9cc1-48d8-a749-36263f3c9b5a nodeName:}" failed. No retries permitted until 2026-03-18 16:47:16.480696543 +0000 UTC m=+207.072637075 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-tls") pod "node-exporter-2rhhp" (UID: "ffc04bd0-9cc1-48d8-a749-36263f3c9b5a") : secret "node-exporter-tls" not found Mar 18 16:47:15.980755 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.980733 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-accelerators-collector-config\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.982002 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.981976 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.982086 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.982066 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:15.997360 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.997313 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwqj4\" (UniqueName: \"kubernetes.io/projected/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-kube-api-access-qwqj4\") pod \"node-exporter-2rhhp\" (UID: 
\"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:15.997994 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:15.997975 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8rwn\" (UniqueName: \"kubernetes.io/projected/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-kube-api-access-t8rwn\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:16.430904 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.430874 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 16:47:16.431079 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.430930 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-cbztj" event={"ID":"f07a196d-57d0-4b38-a848-ed2272802021","Type":"ContainerStarted","Data":"a5efcd15c665c27503d41d1f4c85e711f2c45ccacb7dd680abd1a45360c60197"} Mar 18 16:47:16.431205 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.431186 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:47:16.436148 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.436123 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b8565867-cbztj" Mar 18 16:47:16.451165 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.451125 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-76b8565867-cbztj" podStartSLOduration=42.638243749 podStartE2EDuration="44.451113821s" podCreationTimestamp="2026-03-18 16:46:32 +0000 UTC" firstStartedPulling="2026-03-18 
16:46:33.192059963 +0000 UTC m=+163.784000493" lastFinishedPulling="2026-03-18 16:46:35.004930033 +0000 UTC m=+165.596870565" observedRunningTime="2026-03-18 16:47:16.450824618 +0000 UTC m=+207.042765190" watchObservedRunningTime="2026-03-18 16:47:16.451113821 +0000 UTC m=+207.043054378" Mar 18 16:47:16.467478 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.467453 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-5b85974fd6-rkfll"] Mar 18 16:47:16.472034 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.472015 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-5b85974fd6-rkfll" Mar 18 16:47:16.473823 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.473804 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 18 16:47:16.473948 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.473834 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-vpljl\"" Mar 18 16:47:16.473948 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.473854 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 18 16:47:16.481198 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.481165 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-rkfll"] Mar 18 16:47:16.485764 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.485745 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-tls\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:16.485855 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.485785 2563 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:16.488135 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.488116 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03cfcb33-b824-4a9e-adb6-97fc3f3f59dc-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-f9vqn\" (UID: \"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:16.488250 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.488231 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ffc04bd0-9cc1-48d8-a749-36263f3c9b5a-node-exporter-tls\") pod \"node-exporter-2rhhp\" (UID: \"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a\") " pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:16.586406 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.586383 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdgw\" (UniqueName: \"kubernetes.io/projected/f544a96a-2222-4cef-9b87-65de64826ccb-kube-api-access-zvdgw\") pod \"downloads-5b85974fd6-rkfll\" (UID: \"f544a96a-2222-4cef-9b87-65de64826ccb\") " pod="openshift-console/downloads-5b85974fd6-rkfll" Mar 18 16:47:16.656651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.656631 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" Mar 18 16:47:16.687283 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.687232 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdgw\" (UniqueName: \"kubernetes.io/projected/f544a96a-2222-4cef-9b87-65de64826ccb-kube-api-access-zvdgw\") pod \"downloads-5b85974fd6-rkfll\" (UID: \"f544a96a-2222-4cef-9b87-65de64826ccb\") " pod="openshift-console/downloads-5b85974fd6-rkfll" Mar 18 16:47:16.690239 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.690223 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2rhhp" Mar 18 16:47:16.696150 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.696130 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdgw\" (UniqueName: \"kubernetes.io/projected/f544a96a-2222-4cef-9b87-65de64826ccb-kube-api-access-zvdgw\") pod \"downloads-5b85974fd6-rkfll\" (UID: \"f544a96a-2222-4cef-9b87-65de64826ccb\") " pod="openshift-console/downloads-5b85974fd6-rkfll" Mar 18 16:47:16.698050 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:47:16.698025 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffc04bd0_9cc1_48d8_a749_36263f3c9b5a.slice/crio-509609fc07e38c71e4097f95c3fe3ca702a78e09df6a610b72b228470c79fb3c WatchSource:0}: Error finding container 509609fc07e38c71e4097f95c3fe3ca702a78e09df6a610b72b228470c79fb3c: Status 404 returned error can't find the container with id 509609fc07e38c71e4097f95c3fe3ca702a78e09df6a610b72b228470c79fb3c Mar 18 16:47:16.772158 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.772125 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn"] Mar 18 16:47:16.774684 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:47:16.774652 2563 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03cfcb33_b824_4a9e_adb6_97fc3f3f59dc.slice/crio-2f8130db89263d3696d0bba3ca9ce2798e1fc21ae7ef2b4eafd064cb450c4650 WatchSource:0}: Error finding container 2f8130db89263d3696d0bba3ca9ce2798e1fc21ae7ef2b4eafd064cb450c4650: Status 404 returned error can't find the container with id 2f8130db89263d3696d0bba3ca9ce2798e1fc21ae7ef2b4eafd064cb450c4650 Mar 18 16:47:16.782103 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.782086 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-5b85974fd6-rkfll" Mar 18 16:47:16.918482 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:16.918459 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-rkfll"] Mar 18 16:47:16.920790 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:47:16.920767 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf544a96a_2222_4cef_9b87_65de64826ccb.slice/crio-be30262bffc6c4ddc56bec3aec879d25eafd04e297b84688e8b94b82755abef0 WatchSource:0}: Error finding container be30262bffc6c4ddc56bec3aec879d25eafd04e297b84688e8b94b82755abef0: Status 404 returned error can't find the container with id be30262bffc6c4ddc56bec3aec879d25eafd04e297b84688e8b94b82755abef0 Mar 18 16:47:17.434550 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.434489 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-rkfll" event={"ID":"f544a96a-2222-4cef-9b87-65de64826ccb","Type":"ContainerStarted","Data":"be30262bffc6c4ddc56bec3aec879d25eafd04e297b84688e8b94b82755abef0"} Mar 18 16:47:17.435671 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.435631 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2rhhp" 
event={"ID":"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a","Type":"ContainerStarted","Data":"509609fc07e38c71e4097f95c3fe3ca702a78e09df6a610b72b228470c79fb3c"} Mar 18 16:47:17.437135 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.437106 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" event={"ID":"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc","Type":"ContainerStarted","Data":"6fbd660da448878878923d3bc1facfe4b0035e25db9d0a5ed979257d9ebc6465"} Mar 18 16:47:17.437135 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.437136 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" event={"ID":"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc","Type":"ContainerStarted","Data":"2237070b81eea9ce387a51e992253772102dd2138f6743ac460bbb7905ce081f"} Mar 18 16:47:17.437258 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.437146 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" event={"ID":"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc","Type":"ContainerStarted","Data":"2f8130db89263d3696d0bba3ca9ce2798e1fc21ae7ef2b4eafd064cb450c4650"} Mar 18 16:47:17.778527 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.778428 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-746897889c-jk4jx"] Mar 18 16:47:17.782200 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.782163 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx" Mar 18 16:47:17.784737 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.784712 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Mar 18 16:47:17.784870 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.784743 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Mar 18 16:47:17.785124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.785011 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Mar 18 16:47:17.785124 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.785112 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-56ot17uc5l1av\"" Mar 18 16:47:17.785262 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.785027 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Mar 18 16:47:17.785567 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.785541 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Mar 18 16:47:17.785634 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.785569 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-9fnws\"" Mar 18 16:47:17.797589 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.797566 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-746897889c-jk4jx"] Mar 18 16:47:17.899201 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.899166 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:17.899372 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.899286 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-grpc-tls\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:17.899372 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.899332 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-tls\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:17.899497 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.899419 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-metrics-client-ca\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:17.899497 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.899454 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:17.899497 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.899472 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:17.899497 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.899489 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:17.899721 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:17.899585 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwnn8\" (UniqueName: \"kubernetes.io/projected/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-kube-api-access-fwnn8\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.000129 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.000095 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-metrics-client-ca\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.000312 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.000145 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.000312 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.000168 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.000312 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.000187 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.000312 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.000243 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwnn8\" (UniqueName: \"kubernetes.io/projected/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-kube-api-access-fwnn8\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.000312 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.000283 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.000573 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.000496 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-grpc-tls\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.000573 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.000557 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-tls\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.000878 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.000809 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-metrics-client-ca\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.003231 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.003203 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.003457 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.003413 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.003727 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.003686 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.003952 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.003933 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.004181 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.004164 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-thanos-querier-tls\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.004267 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.004220 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-secret-grpc-tls\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.008333 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.008312 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwnn8\" (UniqueName: \"kubernetes.io/projected/4f60fdfc-0b54-4fb2-9481-e8f451e1ed37-kube-api-access-fwnn8\") pod \"thanos-querier-746897889c-jk4jx\" (UID: \"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37\") " pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.094940 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.094920 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:18.231748 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.231720 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-746897889c-jk4jx"]
Mar 18 16:47:18.234957 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:47:18.234932 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f60fdfc_0b54_4fb2_9481_e8f451e1ed37.slice/crio-17937305c9164675c67d8f03c3860cc4000decd4cbfd94483e6daec971b7c559 WatchSource:0}: Error finding container 17937305c9164675c67d8f03c3860cc4000decd4cbfd94483e6daec971b7c559: Status 404 returned error can't find the container with id 17937305c9164675c67d8f03c3860cc4000decd4cbfd94483e6daec971b7c559
Mar 18 16:47:18.441683 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.441652 2563 generic.go:358] "Generic (PLEG): container finished" podID="ffc04bd0-9cc1-48d8-a749-36263f3c9b5a" containerID="852bd7c4cf21fc6cfeaa270a8233db9ca7c5883cbeefef5f4b2b1729af356629" exitCode=0
Mar 18 16:47:18.441848 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.441736 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2rhhp" event={"ID":"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a","Type":"ContainerDied","Data":"852bd7c4cf21fc6cfeaa270a8233db9ca7c5883cbeefef5f4b2b1729af356629"}
Mar 18 16:47:18.442778 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.442756 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx" event={"ID":"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37","Type":"ContainerStarted","Data":"17937305c9164675c67d8f03c3860cc4000decd4cbfd94483e6daec971b7c559"}
Mar 18 16:47:18.444573 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.444549 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" event={"ID":"03cfcb33-b824-4a9e-adb6-97fc3f3f59dc","Type":"ContainerStarted","Data":"ceb5f11607102e3ab6ca2a71697bc053a9484ad4cd5137fd4391d9562222d013"}
Mar 18 16:47:18.521818 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:18.521782 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-f9vqn" podStartSLOduration=2.386461638 podStartE2EDuration="3.521767713s" podCreationTimestamp="2026-03-18 16:47:15 +0000 UTC" firstStartedPulling="2026-03-18 16:47:16.910092706 +0000 UTC m=+207.502033236" lastFinishedPulling="2026-03-18 16:47:18.045398771 +0000 UTC m=+208.637339311" observedRunningTime="2026-03-18 16:47:18.514998125 +0000 UTC m=+209.106938675" watchObservedRunningTime="2026-03-18 16:47:18.521767713 +0000 UTC m=+209.113708263"
Mar 18 16:47:19.450279 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:19.450234 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2rhhp" event={"ID":"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a","Type":"ContainerStarted","Data":"544301768a34907415622a1be8ef7119a2753bd61332411014331d3be62ba2c5"}
Mar 18 16:47:19.450279 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:19.450280 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2rhhp" event={"ID":"ffc04bd0-9cc1-48d8-a749-36263f3c9b5a","Type":"ContainerStarted","Data":"56321bcb4788201e4b914b70d85e3ebff549606b4194895b83910c282adf958e"}
Mar 18 16:47:19.508403 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:19.508345 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2rhhp" podStartSLOduration=3.7598100690000003 podStartE2EDuration="4.508327299s" podCreationTimestamp="2026-03-18 16:47:15 +0000 UTC" firstStartedPulling="2026-03-18 16:47:16.699760765 +0000 UTC m=+207.291701294" lastFinishedPulling="2026-03-18 16:47:17.448277991 +0000 UTC m=+208.040218524" observedRunningTime="2026-03-18 16:47:19.506315527 +0000 UTC m=+210.098256079" watchObservedRunningTime="2026-03-18 16:47:19.508327299 +0000 UTC m=+210.100267851"
Mar 18 16:47:21.458986 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:21.458955 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx" event={"ID":"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37","Type":"ContainerStarted","Data":"badc15e4726084697ab64ee0f8a0a5f830282c60b9a44832233cc65fd589259f"}
Mar 18 16:47:21.459375 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:21.458994 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx" event={"ID":"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37","Type":"ContainerStarted","Data":"4063b5d4b28a16b1a0630af129928061c31a6f9ff54cfa288ad37ed69b9b32d4"}
Mar 18 16:47:21.459375 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:21.459004 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx" event={"ID":"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37","Type":"ContainerStarted","Data":"61955f6225636c5258c54820b5b786b0d12cae8a5f19caade781ef33958500e7"}
Mar 18 16:47:22.464323 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:22.464275 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx" event={"ID":"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37","Type":"ContainerStarted","Data":"f698db489edc47ba5bac8f27617f13fce4e9707bf891fed3871edd6ca0daf885"}
Mar 18 16:47:22.464323 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:22.464322 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx" event={"ID":"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37","Type":"ContainerStarted","Data":"9bb6dc741030a9e7563fe31e78a1e1ec6929270a9e8e205f7756d51c94fe1311"}
Mar 18 16:47:22.464774 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:22.464336 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx" event={"ID":"4f60fdfc-0b54-4fb2-9481-e8f451e1ed37","Type":"ContainerStarted","Data":"de4fbbb1ee15da857dba0bc7280da8e8d0a17583f52f393c3cd64f4d4b95a192"}
Mar 18 16:47:22.464774 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:22.464529 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:22.509564 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:22.509516 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx" podStartSLOduration=2.236275827 podStartE2EDuration="5.509481573s" podCreationTimestamp="2026-03-18 16:47:17 +0000 UTC" firstStartedPulling="2026-03-18 16:47:18.237051739 +0000 UTC m=+208.828992272" lastFinishedPulling="2026-03-18 16:47:21.510257484 +0000 UTC m=+212.102198018" observedRunningTime="2026-03-18 16:47:22.508126463 +0000 UTC m=+213.100067014" watchObservedRunningTime="2026-03-18 16:47:22.509481573 +0000 UTC m=+213.101422125"
Mar 18 16:47:23.210000 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.209961 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64977ff748-pw6tv"]
Mar 18 16:47:23.210227 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:47:23.210202 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-64977ff748-pw6tv" podUID="491e0153-8caa-4407-b31c-f8618b35079b"
Mar 18 16:47:23.467031 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.466961 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:47:23.472447 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.472424 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:47:23.548237 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.548205 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjptd\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-kube-api-access-fjptd\") pod \"491e0153-8caa-4407-b31c-f8618b35079b\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") "
Mar 18 16:47:23.548346 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.548249 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/491e0153-8caa-4407-b31c-f8618b35079b-ca-trust-extracted\") pod \"491e0153-8caa-4407-b31c-f8618b35079b\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") "
Mar 18 16:47:23.548346 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.548280 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-registry-certificates\") pod \"491e0153-8caa-4407-b31c-f8618b35079b\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") "
Mar 18 16:47:23.548463 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.548389 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-installation-pull-secrets\") pod \"491e0153-8caa-4407-b31c-f8618b35079b\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") "
Mar 18 16:47:23.548463 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.548420 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-bound-sa-token\") pod \"491e0153-8caa-4407-b31c-f8618b35079b\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") "
Mar 18 16:47:23.548602 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.548575 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/491e0153-8caa-4407-b31c-f8618b35079b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "491e0153-8caa-4407-b31c-f8618b35079b" (UID: "491e0153-8caa-4407-b31c-f8618b35079b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 18 16:47:23.548670 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.548654 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "491e0153-8caa-4407-b31c-f8618b35079b" (UID: "491e0153-8caa-4407-b31c-f8618b35079b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:47:23.548879 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.548858 2563 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/491e0153-8caa-4407-b31c-f8618b35079b-ca-trust-extracted\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\""
Mar 18 16:47:23.549122 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.549102 2563 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-registry-certificates\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\""
Mar 18 16:47:23.550631 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.550605 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-kube-api-access-fjptd" (OuterVolumeSpecName: "kube-api-access-fjptd") pod "491e0153-8caa-4407-b31c-f8618b35079b" (UID: "491e0153-8caa-4407-b31c-f8618b35079b"). InnerVolumeSpecName "kube-api-access-fjptd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:47:23.550745 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.550685 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "491e0153-8caa-4407-b31c-f8618b35079b" (UID: "491e0153-8caa-4407-b31c-f8618b35079b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 18 16:47:23.550745 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.550710 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "491e0153-8caa-4407-b31c-f8618b35079b" (UID: "491e0153-8caa-4407-b31c-f8618b35079b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:47:23.649329 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.649307 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-trusted-ca\") pod \"491e0153-8caa-4407-b31c-f8618b35079b\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") "
Mar 18 16:47:23.649439 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.649397 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-image-registry-private-configuration\") pod \"491e0153-8caa-4407-b31c-f8618b35079b\" (UID: \"491e0153-8caa-4407-b31c-f8618b35079b\") "
Mar 18 16:47:23.649626 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.649592 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fjptd\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-kube-api-access-fjptd\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\""
Mar 18 16:47:23.649626 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.649608 2563 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-installation-pull-secrets\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\""
Mar 18 16:47:23.649626 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.649618 2563 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-bound-sa-token\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\""
Mar 18 16:47:23.649793 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.649701 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "491e0153-8caa-4407-b31c-f8618b35079b" (UID: "491e0153-8caa-4407-b31c-f8618b35079b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 18 16:47:23.651409 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.651375 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "491e0153-8caa-4407-b31c-f8618b35079b" (UID: "491e0153-8caa-4407-b31c-f8618b35079b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 18 16:47:23.750003 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.749938 2563 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/491e0153-8caa-4407-b31c-f8618b35079b-image-registry-private-configuration\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\""
Mar 18 16:47:23.750003 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:23.749963 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/491e0153-8caa-4407-b31c-f8618b35079b-trusted-ca\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\""
Mar 18 16:47:24.470115 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:24.470080 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64977ff748-pw6tv"
Mar 18 16:47:24.505171 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:24.505147 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64977ff748-pw6tv"]
Mar 18 16:47:24.509803 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:24.509777 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-64977ff748-pw6tv"]
Mar 18 16:47:24.555729 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:24.555706 2563 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/491e0153-8caa-4407-b31c-f8618b35079b-registry-tls\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\""
Mar 18 16:47:25.941985 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:25.941948 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="491e0153-8caa-4407-b31c-f8618b35079b" path="/var/lib/kubelet/pods/491e0153-8caa-4407-b31c-f8618b35079b/volumes"
Mar 18 16:47:26.410651 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.410619 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65cfcc5956-x5sd8"]
Mar 18 16:47:26.415270 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.415250 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.418252 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.418059 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pbg6k\""
Mar 18 16:47:26.418252 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.418070 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Mar 18 16:47:26.418252 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.418119 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Mar 18 16:47:26.418252 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.418144 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Mar 18 16:47:26.418252 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.418144 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Mar 18 16:47:26.418252 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.418194 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Mar 18 16:47:26.422429 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.422391 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65cfcc5956-x5sd8"]
Mar 18 16:47:26.471295 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.471271 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-service-ca\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.471401 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.471317 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-serving-cert\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.471449 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.471403 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-oauth-config\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.471449 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.471435 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-console-config\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.471554 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.471455 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-oauth-serving-cert\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.471554 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.471474 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxf7\" (UniqueName: \"kubernetes.io/projected/dc29a765-450c-4686-8b12-4a4977337e9f-kube-api-access-rhxf7\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.572384 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.572355 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-oauth-config\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.572547 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.572404 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-console-config\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.572547 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.572441 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-oauth-serving-cert\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.572547 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.572474 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxf7\" (UniqueName: \"kubernetes.io/projected/dc29a765-450c-4686-8b12-4a4977337e9f-kube-api-access-rhxf7\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.572547 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.572528 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-service-ca\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.572762 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.572734 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-serving-cert\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.573260 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.573210 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-console-config\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.573377 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.573267 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-service-ca\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.573377 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.573283 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-oauth-serving-cert\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.575210 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.575171 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-oauth-config\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.575320 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.575275 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-serving-cert\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.583854 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.583817 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhxf7\" (UniqueName: \"kubernetes.io/projected/dc29a765-450c-4686-8b12-4a4977337e9f-kube-api-access-rhxf7\") pod \"console-65cfcc5956-x5sd8\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.727370 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.727300 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65cfcc5956-x5sd8"
Mar 18 16:47:26.858931 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:26.858900 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65cfcc5956-x5sd8"]
Mar 18 16:47:26.862317 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:47:26.862292 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc29a765_450c_4686_8b12_4a4977337e9f.slice/crio-3b113aa692657f7eddfc37c856c7b949f5f4cb98dbde3740eb8cdf4fc5e6db4c WatchSource:0}: Error finding container 3b113aa692657f7eddfc37c856c7b949f5f4cb98dbde3740eb8cdf4fc5e6db4c: Status 404 returned error can't find the container with id 3b113aa692657f7eddfc37c856c7b949f5f4cb98dbde3740eb8cdf4fc5e6db4c
Mar 18 16:47:27.479303 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:27.479272 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65cfcc5956-x5sd8" event={"ID":"dc29a765-450c-4686-8b12-4a4977337e9f","Type":"ContainerStarted","Data":"3b113aa692657f7eddfc37c856c7b949f5f4cb98dbde3740eb8cdf4fc5e6db4c"}
Mar 18 16:47:28.476647 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:28.476614 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-746897889c-jk4jx"
Mar 18 16:47:36.507521 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:36.507472 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-rkfll" event={"ID":"f544a96a-2222-4cef-9b87-65de64826ccb","Type":"ContainerStarted","Data":"05fc0c9e18a14a0fa03ed62dfc3bafb0aaa26cfdff2657ada83e1c40ced5f36a"}
Mar 18 16:47:36.507950 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:36.507772 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-5b85974fd6-rkfll"
Mar 18 16:47:36.509262 ip-10-0-141-231 kubenswrapper[2563]: I0318
16:47:36.509234 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65cfcc5956-x5sd8" event={"ID":"dc29a765-450c-4686-8b12-4a4977337e9f","Type":"ContainerStarted","Data":"2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9"} Mar 18 16:47:36.513032 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:36.513009 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-5b85974fd6-rkfll" Mar 18 16:47:36.525270 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:36.525217 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-5b85974fd6-rkfll" podStartSLOduration=1.621937704 podStartE2EDuration="20.525202144s" podCreationTimestamp="2026-03-18 16:47:16 +0000 UTC" firstStartedPulling="2026-03-18 16:47:16.922744714 +0000 UTC m=+207.514685243" lastFinishedPulling="2026-03-18 16:47:35.826009136 +0000 UTC m=+226.417949683" observedRunningTime="2026-03-18 16:47:36.523935008 +0000 UTC m=+227.115875588" watchObservedRunningTime="2026-03-18 16:47:36.525202144 +0000 UTC m=+227.117142696" Mar 18 16:47:36.540294 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:36.540253 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65cfcc5956-x5sd8" podStartSLOduration=1.61231206 podStartE2EDuration="10.540243421s" podCreationTimestamp="2026-03-18 16:47:26 +0000 UTC" firstStartedPulling="2026-03-18 16:47:26.864379 +0000 UTC m=+217.456319530" lastFinishedPulling="2026-03-18 16:47:35.792310353 +0000 UTC m=+226.384250891" observedRunningTime="2026-03-18 16:47:36.539966482 +0000 UTC m=+227.131907036" watchObservedRunningTime="2026-03-18 16:47:36.540243421 +0000 UTC m=+227.132183972" Mar 18 16:47:36.728039 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:36.727998 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65cfcc5956-x5sd8" Mar 18 
16:47:36.728221 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:36.728054 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65cfcc5956-x5sd8" Mar 18 16:47:36.734194 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:36.734168 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65cfcc5956-x5sd8" Mar 18 16:47:37.517561 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:37.517488 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65cfcc5956-x5sd8" Mar 18 16:47:46.248847 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:47:46.248814 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65cfcc5956-x5sd8"] Mar 18 16:48:01.762103 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:01.762070 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:48:01.764287 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:01.764269 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0b2d8c5-09a4-472a-aad2-fa033be042f3-metrics-certs\") pod \"network-metrics-daemon-kbxwz\" (UID: \"c0b2d8c5-09a4-472a-aad2-fa033be042f3\") " pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:48:01.839797 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:01.839771 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrjl4\"" Mar 18 16:48:01.848241 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:01.848222 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbxwz" Mar 18 16:48:01.962763 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:01.959854 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kbxwz"] Mar 18 16:48:01.979429 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:48:01.979401 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b2d8c5_09a4_472a_aad2_fa033be042f3.slice/crio-ca2c209049761fb0a39d09938541ec72503cd1b4fac190403a74c700daf22415 WatchSource:0}: Error finding container ca2c209049761fb0a39d09938541ec72503cd1b4fac190403a74c700daf22415: Status 404 returned error can't find the container with id ca2c209049761fb0a39d09938541ec72503cd1b4fac190403a74c700daf22415 Mar 18 16:48:02.577916 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:02.577872 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbxwz" event={"ID":"c0b2d8c5-09a4-472a-aad2-fa033be042f3","Type":"ContainerStarted","Data":"ca2c209049761fb0a39d09938541ec72503cd1b4fac190403a74c700daf22415"} Mar 18 16:48:04.584062 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:04.584020 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbxwz" event={"ID":"c0b2d8c5-09a4-472a-aad2-fa033be042f3","Type":"ContainerStarted","Data":"553b62d4c02956ad7c64bd3a6ecbbbf787e6161a15171c6ed0d9569e25834bc8"} Mar 18 16:48:04.584062 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:04.584066 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbxwz" event={"ID":"c0b2d8c5-09a4-472a-aad2-fa033be042f3","Type":"ContainerStarted","Data":"60c705420fe220dfea03bcbbed9e16a6a9361905294e92c4541f98d686609799"} Mar 18 16:48:04.600979 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:04.600927 2563 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-kbxwz" podStartSLOduration=253.028345014 podStartE2EDuration="4m14.600912138s" podCreationTimestamp="2026-03-18 16:43:50 +0000 UTC" firstStartedPulling="2026-03-18 16:48:01.981263764 +0000 UTC m=+252.573204295" lastFinishedPulling="2026-03-18 16:48:03.553830877 +0000 UTC m=+254.145771419" observedRunningTime="2026-03-18 16:48:04.600139844 +0000 UTC m=+255.192080396" watchObservedRunningTime="2026-03-18 16:48:04.600912138 +0000 UTC m=+255.192852742" Mar 18 16:48:11.267669 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.267553 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65cfcc5956-x5sd8" podUID="dc29a765-450c-4686-8b12-4a4977337e9f" containerName="console" containerID="cri-o://2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9" gracePeriod=15 Mar 18 16:48:11.493413 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.493393 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65cfcc5956-x5sd8_dc29a765-450c-4686-8b12-4a4977337e9f/console/0.log" Mar 18 16:48:11.493536 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.493451 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65cfcc5956-x5sd8" Mar 18 16:48:11.605279 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.605216 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65cfcc5956-x5sd8_dc29a765-450c-4686-8b12-4a4977337e9f/console/0.log" Mar 18 16:48:11.605279 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.605264 2563 generic.go:358] "Generic (PLEG): container finished" podID="dc29a765-450c-4686-8b12-4a4977337e9f" containerID="2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9" exitCode=2 Mar 18 16:48:11.605475 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.605320 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65cfcc5956-x5sd8" event={"ID":"dc29a765-450c-4686-8b12-4a4977337e9f","Type":"ContainerDied","Data":"2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9"} Mar 18 16:48:11.605475 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.605327 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65cfcc5956-x5sd8" Mar 18 16:48:11.605475 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.605348 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65cfcc5956-x5sd8" event={"ID":"dc29a765-450c-4686-8b12-4a4977337e9f","Type":"ContainerDied","Data":"3b113aa692657f7eddfc37c856c7b949f5f4cb98dbde3740eb8cdf4fc5e6db4c"} Mar 18 16:48:11.605475 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.605370 2563 scope.go:117] "RemoveContainer" containerID="2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9" Mar 18 16:48:11.613437 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.613413 2563 scope.go:117] "RemoveContainer" containerID="2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9" Mar 18 16:48:11.613728 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:48:11.613707 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9\": container with ID starting with 2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9 not found: ID does not exist" containerID="2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9" Mar 18 16:48:11.613796 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.613735 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9"} err="failed to get container status \"2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9\": rpc error: code = NotFound desc = could not find container \"2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9\": container with ID starting with 2c3d7984e46f5559b8de6740821e21dd38c1f9aaa566ba6262c56295e0375ab9 not found: ID does not exist" Mar 18 16:48:11.631556 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.631527 2563 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-oauth-serving-cert\") pod \"dc29a765-450c-4686-8b12-4a4977337e9f\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " Mar 18 16:48:11.631648 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.631606 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-console-config\") pod \"dc29a765-450c-4686-8b12-4a4977337e9f\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " Mar 18 16:48:11.631648 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.631634 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-oauth-config\") pod \"dc29a765-450c-4686-8b12-4a4977337e9f\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " Mar 18 16:48:11.631744 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.631661 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-service-ca\") pod \"dc29a765-450c-4686-8b12-4a4977337e9f\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " Mar 18 16:48:11.631744 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.631689 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhxf7\" (UniqueName: \"kubernetes.io/projected/dc29a765-450c-4686-8b12-4a4977337e9f-kube-api-access-rhxf7\") pod \"dc29a765-450c-4686-8b12-4a4977337e9f\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " Mar 18 16:48:11.631744 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.631709 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-serving-cert\") pod \"dc29a765-450c-4686-8b12-4a4977337e9f\" (UID: \"dc29a765-450c-4686-8b12-4a4977337e9f\") " Mar 18 16:48:11.631909 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.631883 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dc29a765-450c-4686-8b12-4a4977337e9f" (UID: "dc29a765-450c-4686-8b12-4a4977337e9f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:11.631996 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.631977 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-console-config" (OuterVolumeSpecName: "console-config") pod "dc29a765-450c-4686-8b12-4a4977337e9f" (UID: "dc29a765-450c-4686-8b12-4a4977337e9f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:11.632175 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.632155 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-service-ca" (OuterVolumeSpecName: "service-ca") pod "dc29a765-450c-4686-8b12-4a4977337e9f" (UID: "dc29a765-450c-4686-8b12-4a4977337e9f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 18 16:48:11.633755 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.633725 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dc29a765-450c-4686-8b12-4a4977337e9f" (UID: "dc29a765-450c-4686-8b12-4a4977337e9f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:11.633846 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.633765 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dc29a765-450c-4686-8b12-4a4977337e9f" (UID: "dc29a765-450c-4686-8b12-4a4977337e9f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 18 16:48:11.633898 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.633827 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc29a765-450c-4686-8b12-4a4977337e9f-kube-api-access-rhxf7" (OuterVolumeSpecName: "kube-api-access-rhxf7") pod "dc29a765-450c-4686-8b12-4a4977337e9f" (UID: "dc29a765-450c-4686-8b12-4a4977337e9f"). InnerVolumeSpecName "kube-api-access-rhxf7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 18 16:48:11.732583 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.732561 2563 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-oauth-config\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\"" Mar 18 16:48:11.732583 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.732581 2563 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-service-ca\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\"" Mar 18 16:48:11.732697 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.732590 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rhxf7\" (UniqueName: \"kubernetes.io/projected/dc29a765-450c-4686-8b12-4a4977337e9f-kube-api-access-rhxf7\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\"" Mar 18 16:48:11.732697 ip-10-0-141-231 
kubenswrapper[2563]: I0318 16:48:11.732599 2563 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dc29a765-450c-4686-8b12-4a4977337e9f-console-serving-cert\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\"" Mar 18 16:48:11.732697 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.732608 2563 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-oauth-serving-cert\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\"" Mar 18 16:48:11.732697 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.732618 2563 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dc29a765-450c-4686-8b12-4a4977337e9f-console-config\") on node \"ip-10-0-141-231.ec2.internal\" DevicePath \"\"" Mar 18 16:48:11.925567 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.925541 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65cfcc5956-x5sd8"] Mar 18 16:48:11.928990 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.928970 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65cfcc5956-x5sd8"] Mar 18 16:48:11.939750 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:11.939730 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc29a765-450c-4686-8b12-4a4977337e9f" path="/var/lib/kubelet/pods/dc29a765-450c-4686-8b12-4a4977337e9f/volumes" Mar 18 16:48:28.313125 ip-10-0-141-231 kubenswrapper[2563]: E0318 16:48:28.313080 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-kqxqw" podUID="81a2dba0-4d01-448c-accb-07510f0c8197" Mar 18 16:48:28.649648 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:28.649622 2563 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kqxqw" Mar 18 16:48:31.775310 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.775251 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:48:31.775310 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.775312 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: \"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:48:31.775728 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.775345 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:48:31.777612 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.777589 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/81a2dba0-4d01-448c-accb-07510f0c8197-metrics-tls\") pod \"dns-default-kqxqw\" (UID: \"81a2dba0-4d01-448c-accb-07510f0c8197\") " pod="openshift-dns/dns-default-kqxqw" Mar 18 16:48:31.777731 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.777688 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c-cert\") pod \"ingress-canary-rzxp5\" (UID: 
\"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c\") " pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:48:31.777885 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.777864 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19412877-e39c-4727-8a7e-12bcaf2b8450-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-kqsnz\" (UID: \"19412877-e39c-4727-8a7e-12bcaf2b8450\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:48:31.840496 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.840473 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tcvm2\"" Mar 18 16:48:31.848910 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.848884 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rzxp5" Mar 18 16:48:31.951676 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.951652 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-775g5\"" Mar 18 16:48:31.960570 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.960522 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-kqxqw" Mar 18 16:48:31.961945 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:31.961924 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rzxp5"] Mar 18 16:48:31.966488 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:48:31.966466 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6b9e883_ffdb_4d9a_9f0a_5e114ac87b4c.slice/crio-cc7921eca313126a49c526424343e0007e8a18010034d1277ebb89c04b13a2e6 WatchSource:0}: Error finding container cc7921eca313126a49c526424343e0007e8a18010034d1277ebb89c04b13a2e6: Status 404 returned error can't find the container with id cc7921eca313126a49c526424343e0007e8a18010034d1277ebb89c04b13a2e6 Mar 18 16:48:32.038612 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:32.038589 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-glssf\"" Mar 18 16:48:32.047240 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:32.047215 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" Mar 18 16:48:32.072681 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:32.072654 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kqxqw"] Mar 18 16:48:32.074841 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:48:32.074816 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81a2dba0_4d01_448c_accb_07510f0c8197.slice/crio-bd1cab86c1ef0b90f6c661a9a48a97c0e6f17fe1b5da91126d8cae1273d3911f WatchSource:0}: Error finding container bd1cab86c1ef0b90f6c661a9a48a97c0e6f17fe1b5da91126d8cae1273d3911f: Status 404 returned error can't find the container with id bd1cab86c1ef0b90f6c661a9a48a97c0e6f17fe1b5da91126d8cae1273d3911f Mar 18 16:48:32.158542 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:32.158496 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-kqsnz"] Mar 18 16:48:32.161361 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:48:32.161336 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19412877_e39c_4727_8a7e_12bcaf2b8450.slice/crio-39ddbf17e57d541c6fa1a2211b8932b347eefee507607004c36716f90e3e5681 WatchSource:0}: Error finding container 39ddbf17e57d541c6fa1a2211b8932b347eefee507607004c36716f90e3e5681: Status 404 returned error can't find the container with id 39ddbf17e57d541c6fa1a2211b8932b347eefee507607004c36716f90e3e5681 Mar 18 16:48:32.663702 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:32.663662 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" event={"ID":"19412877-e39c-4727-8a7e-12bcaf2b8450","Type":"ContainerStarted","Data":"39ddbf17e57d541c6fa1a2211b8932b347eefee507607004c36716f90e3e5681"} Mar 18 16:48:32.664950 
ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:32.664884 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kqxqw" event={"ID":"81a2dba0-4d01-448c-accb-07510f0c8197","Type":"ContainerStarted","Data":"bd1cab86c1ef0b90f6c661a9a48a97c0e6f17fe1b5da91126d8cae1273d3911f"} Mar 18 16:48:32.666609 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:32.666583 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rzxp5" event={"ID":"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c","Type":"ContainerStarted","Data":"cc7921eca313126a49c526424343e0007e8a18010034d1277ebb89c04b13a2e6"} Mar 18 16:48:34.675638 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:34.675554 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" event={"ID":"19412877-e39c-4727-8a7e-12bcaf2b8450","Type":"ContainerStarted","Data":"40efefcc8b966c1241bcf3897765f03e0fb3663cd318946d5015e90ba57dd7bd"} Mar 18 16:48:34.677026 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:34.677003 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kqxqw" event={"ID":"81a2dba0-4d01-448c-accb-07510f0c8197","Type":"ContainerStarted","Data":"ac3c1e1f96222f0246b53f68f3754b7480e675e55d70f2a5548ae64bf0ce49b4"} Mar 18 16:48:34.677143 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:34.677029 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kqxqw" event={"ID":"81a2dba0-4d01-448c-accb-07510f0c8197","Type":"ContainerStarted","Data":"912c32bb967c655c6f2d997bd3b115bae01e7d36635e045840407a45515f7b0a"} Mar 18 16:48:34.677206 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:34.677143 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kqxqw" Mar 18 16:48:34.678161 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:34.678143 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-rzxp5" event={"ID":"f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c","Type":"ContainerStarted","Data":"20ae03d140b5bf487398b52e4100ee72502c84c756f11dd92683155875eec14d"} Mar 18 16:48:34.692909 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:34.692865 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-55b77584bb-kqsnz" podStartSLOduration=280.714808042 podStartE2EDuration="4m42.692850143s" podCreationTimestamp="2026-03-18 16:43:52 +0000 UTC" firstStartedPulling="2026-03-18 16:48:32.163189968 +0000 UTC m=+282.755130497" lastFinishedPulling="2026-03-18 16:48:34.141232061 +0000 UTC m=+284.733172598" observedRunningTime="2026-03-18 16:48:34.691821999 +0000 UTC m=+285.283762550" watchObservedRunningTime="2026-03-18 16:48:34.692850143 +0000 UTC m=+285.284790694" Mar 18 16:48:34.707626 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:34.707582 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kqxqw" podStartSLOduration=251.642398784 podStartE2EDuration="4m13.707570924s" podCreationTimestamp="2026-03-18 16:44:21 +0000 UTC" firstStartedPulling="2026-03-18 16:48:32.076695634 +0000 UTC m=+282.668636163" lastFinishedPulling="2026-03-18 16:48:34.141867775 +0000 UTC m=+284.733808303" observedRunningTime="2026-03-18 16:48:34.706754685 +0000 UTC m=+285.298695235" watchObservedRunningTime="2026-03-18 16:48:34.707570924 +0000 UTC m=+285.299511480" Mar 18 16:48:34.720688 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:34.720651 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rzxp5" podStartSLOduration=251.54270877 podStartE2EDuration="4m13.720644006s" podCreationTimestamp="2026-03-18 16:44:21 +0000 UTC" firstStartedPulling="2026-03-18 16:48:31.968381795 +0000 UTC m=+282.560322326" lastFinishedPulling="2026-03-18 16:48:34.14631703 +0000 UTC 
m=+284.738257562" observedRunningTime="2026-03-18 16:48:34.720270851 +0000 UTC m=+285.312211440" watchObservedRunningTime="2026-03-18 16:48:34.720644006 +0000 UTC m=+285.312584557" Mar 18 16:48:44.683637 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:44.683606 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kqxqw" Mar 18 16:48:49.890149 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:49.890124 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 16:48:49.891076 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:49.891060 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 16:48:49.901161 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:48:49.901146 2563 kubelet.go:1628] "Image garbage collection succeeded" Mar 18 16:50:52.645849 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.645814 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85"] Mar 18 16:50:52.646247 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.646102 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc29a765-450c-4686-8b12-4a4977337e9f" containerName="console" Mar 18 16:50:52.646247 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.646114 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc29a765-450c-4686-8b12-4a4977337e9f" containerName="console" Mar 18 16:50:52.646247 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.646156 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc29a765-450c-4686-8b12-4a4977337e9f" containerName="console" Mar 18 16:50:52.647972 ip-10-0-141-231 kubenswrapper[2563]: I0318 
16:50:52.647958 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" Mar 18 16:50:52.650495 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.650473 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Mar 18 16:50:52.650635 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.650493 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Mar 18 16:50:52.650635 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.650583 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Mar 18 16:50:52.650751 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.650648 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Mar 18 16:50:52.650807 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.650751 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-9b7qs\"" Mar 18 16:50:52.655541 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.655520 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85"] Mar 18 16:50:52.736706 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.736685 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk"] Mar 18 16:50:52.738849 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.738835 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.740431 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.740416 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Mar 18 16:50:52.740826 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.740813 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Mar 18 16:50:52.740900 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.740836 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Mar 18 16:50:52.740900 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.740861 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Mar 18 16:50:52.742628 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.742610 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7b5ccfb677-crb85\" (UID: \"fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" Mar 18 16:50:52.742730 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.742640 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89tt5\" (UniqueName: \"kubernetes.io/projected/fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86-kube-api-access-89tt5\") pod \"managed-serviceaccount-addon-agent-7b5ccfb677-crb85\" (UID: 
\"fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" Mar 18 16:50:52.748241 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.748218 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk"] Mar 18 16:50:52.843405 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.843381 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-hub\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.843527 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.843418 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.843527 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.843440 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-ca\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.843527 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.843457 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/fe8f1139-d084-4737-add6-e8fde250073b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.843527 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.843482 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.843527 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.843521 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7b5ccfb677-crb85\" (UID: \"fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" Mar 18 16:50:52.843685 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.843548 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89tt5\" (UniqueName: \"kubernetes.io/projected/fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86-kube-api-access-89tt5\") pod \"managed-serviceaccount-addon-agent-7b5ccfb677-crb85\" (UID: \"fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" Mar 18 16:50:52.843685 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.843567 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjkrs\" (UniqueName: 
\"kubernetes.io/projected/fe8f1139-d084-4737-add6-e8fde250073b-kube-api-access-bjkrs\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.845874 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.845851 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7b5ccfb677-crb85\" (UID: \"fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" Mar 18 16:50:52.850302 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.850274 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89tt5\" (UniqueName: \"kubernetes.io/projected/fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86-kube-api-access-89tt5\") pod \"managed-serviceaccount-addon-agent-7b5ccfb677-crb85\" (UID: \"fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" Mar 18 16:50:52.944476 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.944425 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-ca\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.944476 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.944452 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fe8f1139-d084-4737-add6-e8fde250073b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: 
\"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.944476 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.944472 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.944686 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.944615 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjkrs\" (UniqueName: \"kubernetes.io/projected/fe8f1139-d084-4737-add6-e8fde250073b-kube-api-access-bjkrs\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.944686 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.944653 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-hub\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.944686 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.944685 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.945212 
ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.945189 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fe8f1139-d084-4737-add6-e8fde250073b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.946656 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.946636 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.946819 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.946797 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-ca\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.946881 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.946833 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-hub\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.946985 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.946969 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/fe8f1139-d084-4737-add6-e8fde250073b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.952101 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.952082 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjkrs\" (UniqueName: \"kubernetes.io/projected/fe8f1139-d084-4737-add6-e8fde250073b-kube-api-access-bjkrs\") pod \"cluster-proxy-proxy-agent-c49dcfbc7-9b9tk\" (UID: \"fe8f1139-d084-4737-add6-e8fde250073b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:52.963953 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:52.963935 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" Mar 18 16:50:53.046484 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:53.046464 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" Mar 18 16:50:53.074710 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:53.074664 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85"] Mar 18 16:50:53.078057 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:50:53.078031 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe42bfa9_1b60_44d8_bbfd_658dfdf7ca86.slice/crio-39962f38fe4fe39a7180ea0bf5570209b2bb81812f6bad669157c27f688b8329 WatchSource:0}: Error finding container 39962f38fe4fe39a7180ea0bf5570209b2bb81812f6bad669157c27f688b8329: Status 404 returned error can't find the container with id 39962f38fe4fe39a7180ea0bf5570209b2bb81812f6bad669157c27f688b8329 Mar 18 16:50:53.080139 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:53.080117 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 16:50:53.160596 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:53.160555 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk"] Mar 18 16:50:53.166236 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:50:53.166209 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe8f1139_d084_4737_add6_e8fde250073b.slice/crio-8cd07fe38c17bb7c8d5d1d4286cf56f821602d5ac4e81607ef948700bb379f3e WatchSource:0}: Error finding container 8cd07fe38c17bb7c8d5d1d4286cf56f821602d5ac4e81607ef948700bb379f3e: Status 404 returned error can't find the container with id 8cd07fe38c17bb7c8d5d1d4286cf56f821602d5ac4e81607ef948700bb379f3e Mar 18 16:50:54.044478 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:54.044424 2563 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" event={"ID":"fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86","Type":"ContainerStarted","Data":"39962f38fe4fe39a7180ea0bf5570209b2bb81812f6bad669157c27f688b8329"} Mar 18 16:50:54.045892 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:54.045854 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" event={"ID":"fe8f1139-d084-4737-add6-e8fde250073b","Type":"ContainerStarted","Data":"8cd07fe38c17bb7c8d5d1d4286cf56f821602d5ac4e81607ef948700bb379f3e"} Mar 18 16:50:57.056086 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:57.056053 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" event={"ID":"fe8f1139-d084-4737-add6-e8fde250073b","Type":"ContainerStarted","Data":"4d6528e464362b74a6a2018439be948081aeb9390a5dcd14eb46004cb29d05eb"} Mar 18 16:50:57.057463 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:57.057437 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" event={"ID":"fe42bfa9-1b60-44d8-bbfd-658dfdf7ca86","Type":"ContainerStarted","Data":"e2ec8c26bc6f26127f06b173be47a50fe7cb785e4f8c9ad50a61c17eafe22fc1"} Mar 18 16:50:57.088667 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:50:57.088629 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b5ccfb677-crb85" podStartSLOduration=1.702621569 podStartE2EDuration="5.08861663s" podCreationTimestamp="2026-03-18 16:50:52 +0000 UTC" firstStartedPulling="2026-03-18 16:50:53.080289085 +0000 UTC m=+423.672229614" lastFinishedPulling="2026-03-18 16:50:56.466284132 +0000 UTC m=+427.058224675" observedRunningTime="2026-03-18 16:50:57.088253218 +0000 UTC m=+427.680193768" 
watchObservedRunningTime="2026-03-18 16:50:57.08861663 +0000 UTC m=+427.680557179" Mar 18 16:51:01.070245 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:01.070208 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" event={"ID":"fe8f1139-d084-4737-add6-e8fde250073b","Type":"ContainerStarted","Data":"f15937e9e8802e3ebd03d211d89c4c19412decc0fc3dfd07f60248397a2481ef"} Mar 18 16:51:01.070245 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:01.070249 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" event={"ID":"fe8f1139-d084-4737-add6-e8fde250073b","Type":"ContainerStarted","Data":"d84c6d3cebc2403297e6afa8c44fdcd3a9353ce73157490e4559627b196f5c24"} Mar 18 16:51:01.091860 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:01.091817 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-c49dcfbc7-9b9tk" podStartSLOduration=2.140350138 podStartE2EDuration="9.091804826s" podCreationTimestamp="2026-03-18 16:50:52 +0000 UTC" firstStartedPulling="2026-03-18 16:50:53.167868643 +0000 UTC m=+423.759809171" lastFinishedPulling="2026-03-18 16:51:00.11932333 +0000 UTC m=+430.711263859" observedRunningTime="2026-03-18 16:51:01.090859627 +0000 UTC m=+431.682800179" watchObservedRunningTime="2026-03-18 16:51:01.091804826 +0000 UTC m=+431.683745375" Mar 18 16:51:26.780229 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.780195 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-gdxz6"] Mar 18 16:51:26.783278 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.783260 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:51:26.785710 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.785686 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-k4mn4\"" Mar 18 16:51:26.785710 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.785700 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Mar 18 16:51:26.785871 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.785698 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Mar 18 16:51:26.785871 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.785703 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Mar 18 16:51:26.785871 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.785690 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Mar 18 16:51:26.789850 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.789799 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-gdxz6"] Mar 18 16:51:26.883766 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.883739 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/afc35ee1-cb01-425f-aa09-6a7cfd700af9-certificates\") pod \"keda-admission-cf49989db-gdxz6\" (UID: \"afc35ee1-cb01-425f-aa09-6a7cfd700af9\") " pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:51:26.883920 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.883779 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfbk8\" (UniqueName: 
\"kubernetes.io/projected/afc35ee1-cb01-425f-aa09-6a7cfd700af9-kube-api-access-bfbk8\") pod \"keda-admission-cf49989db-gdxz6\" (UID: \"afc35ee1-cb01-425f-aa09-6a7cfd700af9\") " pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:51:26.984914 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.984890 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/afc35ee1-cb01-425f-aa09-6a7cfd700af9-certificates\") pod \"keda-admission-cf49989db-gdxz6\" (UID: \"afc35ee1-cb01-425f-aa09-6a7cfd700af9\") " pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:51:26.985038 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.984930 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfbk8\" (UniqueName: \"kubernetes.io/projected/afc35ee1-cb01-425f-aa09-6a7cfd700af9-kube-api-access-bfbk8\") pod \"keda-admission-cf49989db-gdxz6\" (UID: \"afc35ee1-cb01-425f-aa09-6a7cfd700af9\") " pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:51:26.987293 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.987272 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/afc35ee1-cb01-425f-aa09-6a7cfd700af9-certificates\") pod \"keda-admission-cf49989db-gdxz6\" (UID: \"afc35ee1-cb01-425f-aa09-6a7cfd700af9\") " pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:51:26.992207 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:26.992187 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfbk8\" (UniqueName: \"kubernetes.io/projected/afc35ee1-cb01-425f-aa09-6a7cfd700af9-kube-api-access-bfbk8\") pod \"keda-admission-cf49989db-gdxz6\" (UID: \"afc35ee1-cb01-425f-aa09-6a7cfd700af9\") " pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:51:27.094295 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:27.094208 2563 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:51:27.203913 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:27.203891 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-gdxz6"] Mar 18 16:51:27.206003 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:51:27.205974 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafc35ee1_cb01_425f_aa09_6a7cfd700af9.slice/crio-2db97ae9cff2f1ce4824840b32a2fd81c253afe7cb6bc2ad212f4cc5583acd0d WatchSource:0}: Error finding container 2db97ae9cff2f1ce4824840b32a2fd81c253afe7cb6bc2ad212f4cc5583acd0d: Status 404 returned error can't find the container with id 2db97ae9cff2f1ce4824840b32a2fd81c253afe7cb6bc2ad212f4cc5583acd0d Mar 18 16:51:28.142212 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:28.142178 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-gdxz6" event={"ID":"afc35ee1-cb01-425f-aa09-6a7cfd700af9","Type":"ContainerStarted","Data":"2db97ae9cff2f1ce4824840b32a2fd81c253afe7cb6bc2ad212f4cc5583acd0d"} Mar 18 16:51:43.185490 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:43.185455 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-gdxz6" event={"ID":"afc35ee1-cb01-425f-aa09-6a7cfd700af9","Type":"ContainerStarted","Data":"4b8fb7a874556e912a3f563994aaf8139a01944a2ade962200e0ff17b1db4764"} Mar 18 16:51:43.185869 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:43.185604 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:51:43.210442 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:51:43.210397 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-gdxz6" podStartSLOduration=1.879269051 
podStartE2EDuration="17.210385874s" podCreationTimestamp="2026-03-18 16:51:26 +0000 UTC" firstStartedPulling="2026-03-18 16:51:27.207114308 +0000 UTC m=+457.799054839" lastFinishedPulling="2026-03-18 16:51:42.538231121 +0000 UTC m=+473.130171662" observedRunningTime="2026-03-18 16:51:43.208590381 +0000 UTC m=+473.800530927" watchObservedRunningTime="2026-03-18 16:51:43.210385874 +0000 UTC m=+473.802326421" Mar 18 16:52:04.190218 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:04.190187 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-gdxz6" Mar 18 16:52:32.147766 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.147732 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-748c497bc-kj2h8"] Mar 18 16:52:32.154176 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.154154 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-748c497bc-kj2h8" Mar 18 16:52:32.156315 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.156288 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Mar 18 16:52:32.156625 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.156608 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Mar 18 16:52:32.156834 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.156811 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-r6d2z\"" Mar 18 16:52:32.157067 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.157047 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Mar 18 16:52:32.157336 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.157315 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-748c497bc-kj2h8"] Mar 18 16:52:32.245334 
ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.245296 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xfpg\" (UniqueName: \"kubernetes.io/projected/cd5c1761-2a97-4523-a981-0528a774c21a-kube-api-access-9xfpg\") pod \"seaweedfs-748c497bc-kj2h8\" (UID: \"cd5c1761-2a97-4523-a981-0528a774c21a\") " pod="kserve/seaweedfs-748c497bc-kj2h8" Mar 18 16:52:32.245527 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.245344 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cd5c1761-2a97-4523-a981-0528a774c21a-data\") pod \"seaweedfs-748c497bc-kj2h8\" (UID: \"cd5c1761-2a97-4523-a981-0528a774c21a\") " pod="kserve/seaweedfs-748c497bc-kj2h8" Mar 18 16:52:32.346366 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.346338 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xfpg\" (UniqueName: \"kubernetes.io/projected/cd5c1761-2a97-4523-a981-0528a774c21a-kube-api-access-9xfpg\") pod \"seaweedfs-748c497bc-kj2h8\" (UID: \"cd5c1761-2a97-4523-a981-0528a774c21a\") " pod="kserve/seaweedfs-748c497bc-kj2h8" Mar 18 16:52:32.346544 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.346374 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cd5c1761-2a97-4523-a981-0528a774c21a-data\") pod \"seaweedfs-748c497bc-kj2h8\" (UID: \"cd5c1761-2a97-4523-a981-0528a774c21a\") " pod="kserve/seaweedfs-748c497bc-kj2h8" Mar 18 16:52:32.346754 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.346737 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cd5c1761-2a97-4523-a981-0528a774c21a-data\") pod \"seaweedfs-748c497bc-kj2h8\" (UID: \"cd5c1761-2a97-4523-a981-0528a774c21a\") " pod="kserve/seaweedfs-748c497bc-kj2h8" Mar 18 16:52:32.353730 
ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.353703 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xfpg\" (UniqueName: \"kubernetes.io/projected/cd5c1761-2a97-4523-a981-0528a774c21a-kube-api-access-9xfpg\") pod \"seaweedfs-748c497bc-kj2h8\" (UID: \"cd5c1761-2a97-4523-a981-0528a774c21a\") " pod="kserve/seaweedfs-748c497bc-kj2h8" Mar 18 16:52:32.464607 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.464526 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-748c497bc-kj2h8" Mar 18 16:52:32.578353 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:32.578322 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-748c497bc-kj2h8"] Mar 18 16:52:32.581936 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:52:32.581893 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd5c1761_2a97_4523_a981_0528a774c21a.slice/crio-53d5ac3f5065a2fca2ebb7df4513511c1eae412a05c2ca183a8cda6a8cb4ea72 WatchSource:0}: Error finding container 53d5ac3f5065a2fca2ebb7df4513511c1eae412a05c2ca183a8cda6a8cb4ea72: Status 404 returned error can't find the container with id 53d5ac3f5065a2fca2ebb7df4513511c1eae412a05c2ca183a8cda6a8cb4ea72 Mar 18 16:52:33.086931 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.086897 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-p79cf"] Mar 18 16:52:33.089355 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.089332 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:52:33.091968 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.091947 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-d6b54\"" Mar 18 16:52:33.092079 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.091947 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Mar 18 16:52:33.099050 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.099004 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-p79cf"] Mar 18 16:52:33.152889 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.152857 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02e2fe9d-0d5f-42f1-b883-1fbe230e412d-cert\") pod \"kserve-controller-manager-69d7c9bbdc-p79cf\" (UID: \"02e2fe9d-0d5f-42f1-b883-1fbe230e412d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:52:33.153298 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.152897 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7m6w\" (UniqueName: \"kubernetes.io/projected/02e2fe9d-0d5f-42f1-b883-1fbe230e412d-kube-api-access-j7m6w\") pod \"kserve-controller-manager-69d7c9bbdc-p79cf\" (UID: \"02e2fe9d-0d5f-42f1-b883-1fbe230e412d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:52:33.253376 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.253344 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02e2fe9d-0d5f-42f1-b883-1fbe230e412d-cert\") pod \"kserve-controller-manager-69d7c9bbdc-p79cf\" (UID: \"02e2fe9d-0d5f-42f1-b883-1fbe230e412d\") " 
pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:52:33.253573 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.253383 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7m6w\" (UniqueName: \"kubernetes.io/projected/02e2fe9d-0d5f-42f1-b883-1fbe230e412d-kube-api-access-j7m6w\") pod \"kserve-controller-manager-69d7c9bbdc-p79cf\" (UID: \"02e2fe9d-0d5f-42f1-b883-1fbe230e412d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:52:33.256087 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.256057 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02e2fe9d-0d5f-42f1-b883-1fbe230e412d-cert\") pod \"kserve-controller-manager-69d7c9bbdc-p79cf\" (UID: \"02e2fe9d-0d5f-42f1-b883-1fbe230e412d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:52:33.260842 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.260817 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7m6w\" (UniqueName: \"kubernetes.io/projected/02e2fe9d-0d5f-42f1-b883-1fbe230e412d-kube-api-access-j7m6w\") pod \"kserve-controller-manager-69d7c9bbdc-p79cf\" (UID: \"02e2fe9d-0d5f-42f1-b883-1fbe230e412d\") " pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:52:33.325363 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.325328 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-748c497bc-kj2h8" event={"ID":"cd5c1761-2a97-4523-a981-0528a774c21a","Type":"ContainerStarted","Data":"53d5ac3f5065a2fca2ebb7df4513511c1eae412a05c2ca183a8cda6a8cb4ea72"} Mar 18 16:52:33.402030 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.401998 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:52:33.536103 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:33.536069 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-69d7c9bbdc-p79cf"] Mar 18 16:52:33.540537 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:52:33.540488 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02e2fe9d_0d5f_42f1_b883_1fbe230e412d.slice/crio-39ff25f336a8132ff6bea5ae258d1724f07f47bd84a53bfceef642edbc08de15 WatchSource:0}: Error finding container 39ff25f336a8132ff6bea5ae258d1724f07f47bd84a53bfceef642edbc08de15: Status 404 returned error can't find the container with id 39ff25f336a8132ff6bea5ae258d1724f07f47bd84a53bfceef642edbc08de15 Mar 18 16:52:34.330798 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:34.330731 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" event={"ID":"02e2fe9d-0d5f-42f1-b883-1fbe230e412d","Type":"ContainerStarted","Data":"39ff25f336a8132ff6bea5ae258d1724f07f47bd84a53bfceef642edbc08de15"} Mar 18 16:52:37.342641 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:37.342608 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" event={"ID":"02e2fe9d-0d5f-42f1-b883-1fbe230e412d","Type":"ContainerStarted","Data":"9a23bcb7a5320006959b481122a590db7d9a490bc4fbf5f8963e76be514413ee"} Mar 18 16:52:37.343088 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:37.342680 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:52:37.343874 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:37.343849 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-748c497bc-kj2h8" 
event={"ID":"cd5c1761-2a97-4523-a981-0528a774c21a","Type":"ContainerStarted","Data":"4db7ebc5c87c1e54840fa1bfaccc3ac82c5a08b102f0972ba28cec48a1f2a2b3"} Mar 18 16:52:37.358281 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:37.358228 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" podStartSLOduration=1.5729854859999999 podStartE2EDuration="4.358211877s" podCreationTimestamp="2026-03-18 16:52:33 +0000 UTC" firstStartedPulling="2026-03-18 16:52:33.542944076 +0000 UTC m=+524.134884613" lastFinishedPulling="2026-03-18 16:52:36.32817047 +0000 UTC m=+526.920111004" observedRunningTime="2026-03-18 16:52:37.357182578 +0000 UTC m=+527.949123128" watchObservedRunningTime="2026-03-18 16:52:37.358211877 +0000 UTC m=+527.950152429" Mar 18 16:52:37.373880 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:52:37.373839 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-748c497bc-kj2h8" podStartSLOduration=1.5908645259999998 podStartE2EDuration="5.373826785s" podCreationTimestamp="2026-03-18 16:52:32 +0000 UTC" firstStartedPulling="2026-03-18 16:52:32.583160137 +0000 UTC m=+523.175100667" lastFinishedPulling="2026-03-18 16:52:36.36612239 +0000 UTC m=+526.958062926" observedRunningTime="2026-03-18 16:52:37.372896095 +0000 UTC m=+527.964836649" watchObservedRunningTime="2026-03-18 16:52:37.373826785 +0000 UTC m=+527.965767337" Mar 18 16:53:08.353911 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:08.353879 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-69d7c9bbdc-p79cf" Mar 18 16:53:47.925642 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.925607 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-55849c9b96-vzdkp"] Mar 18 16:53:47.928251 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.928222 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:47.930485 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.930463 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pbg6k\"" Mar 18 16:53:47.930772 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.930743 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 18 16:53:47.930906 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.930871 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 18 16:53:47.930978 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.930874 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 18 16:53:47.930978 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.930966 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 18 16:53:47.931304 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.931284 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 18 16:53:47.936074 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.936029 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Mar 18 16:53:47.941488 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:47.941470 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55849c9b96-vzdkp"] Mar 18 16:53:48.095830 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.095798 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmhk6\" (UniqueName: 
\"kubernetes.io/projected/74693339-c91f-44d2-b06e-a8a566db8d59-kube-api-access-hmhk6\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.095976 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.095843 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-oauth-serving-cert\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.095976 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.095878 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-trusted-ca-bundle\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.095976 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.095898 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-service-ca\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.095976 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.095936 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74693339-c91f-44d2-b06e-a8a566db8d59-console-serving-cert\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.096126 ip-10-0-141-231 
kubenswrapper[2563]: I0318 16:53:48.095980 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-console-config\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.096126 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.096056 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74693339-c91f-44d2-b06e-a8a566db8d59-console-oauth-config\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.196721 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.196654 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74693339-c91f-44d2-b06e-a8a566db8d59-console-oauth-config\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.196721 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.196685 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmhk6\" (UniqueName: \"kubernetes.io/projected/74693339-c91f-44d2-b06e-a8a566db8d59-kube-api-access-hmhk6\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.196721 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.196704 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-oauth-serving-cert\") pod \"console-55849c9b96-vzdkp\" (UID: 
\"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.196926 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.196739 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-trusted-ca-bundle\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.196926 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.196769 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-service-ca\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.196926 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.196810 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74693339-c91f-44d2-b06e-a8a566db8d59-console-serving-cert\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.196926 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.196847 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-console-config\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.197570 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.197541 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-service-ca\") pod 
\"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.197696 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.197541 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-oauth-serving-cert\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.197763 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.197694 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-console-config\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.197820 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.197773 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74693339-c91f-44d2-b06e-a8a566db8d59-trusted-ca-bundle\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.199252 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.199224 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74693339-c91f-44d2-b06e-a8a566db8d59-console-oauth-config\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.199391 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.199374 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/74693339-c91f-44d2-b06e-a8a566db8d59-console-serving-cert\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.204080 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.204062 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmhk6\" (UniqueName: \"kubernetes.io/projected/74693339-c91f-44d2-b06e-a8a566db8d59-kube-api-access-hmhk6\") pod \"console-55849c9b96-vzdkp\" (UID: \"74693339-c91f-44d2-b06e-a8a566db8d59\") " pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.239546 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.239526 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:48.355211 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.355186 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55849c9b96-vzdkp"] Mar 18 16:53:48.357640 ip-10-0-141-231 kubenswrapper[2563]: W0318 16:53:48.357616 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74693339_c91f_44d2_b06e_a8a566db8d59.slice/crio-872a370ee2105eaccacbb1c50450b41c763c785be0499163c770cc360c1eeddf WatchSource:0}: Error finding container 872a370ee2105eaccacbb1c50450b41c763c785be0499163c770cc360c1eeddf: Status 404 returned error can't find the container with id 872a370ee2105eaccacbb1c50450b41c763c785be0499163c770cc360c1eeddf Mar 18 16:53:48.551021 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.550937 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55849c9b96-vzdkp" event={"ID":"74693339-c91f-44d2-b06e-a8a566db8d59","Type":"ContainerStarted","Data":"8b64934f3afb8ecc0fc7a7ca4ad25329293493ebe8d1843ba04020fb8220d6d0"} Mar 18 16:53:48.551021 ip-10-0-141-231 kubenswrapper[2563]: I0318 
16:53:48.550972 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55849c9b96-vzdkp" event={"ID":"74693339-c91f-44d2-b06e-a8a566db8d59","Type":"ContainerStarted","Data":"872a370ee2105eaccacbb1c50450b41c763c785be0499163c770cc360c1eeddf"} Mar 18 16:53:48.569189 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:48.569141 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55849c9b96-vzdkp" podStartSLOduration=1.569129231 podStartE2EDuration="1.569129231s" podCreationTimestamp="2026-03-18 16:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 16:53:48.568555283 +0000 UTC m=+599.160495831" watchObservedRunningTime="2026-03-18 16:53:48.569129231 +0000 UTC m=+599.161069790" Mar 18 16:53:49.913073 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:49.913043 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 16:53:49.914903 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:49.914881 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 16:53:58.240531 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:58.240480 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:58.240531 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:58.240538 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:58.245001 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:58.244979 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:53:58.582241 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:53:58.582161 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55849c9b96-vzdkp" Mar 18 16:58:49.936525 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:58:49.936427 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 16:58:49.938681 ip-10-0-141-231 kubenswrapper[2563]: I0318 16:58:49.938654 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:03:49.957582 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:03:49.957554 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:03:49.960101 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:03:49.960079 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:08:49.977731 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:08:49.977610 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:08:49.980105 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:08:49.980083 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:13:49.996376 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:13:49.996272 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:13:50.005581 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:13:49.999903 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:18:50.020956 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:18:50.020840 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:18:50.025040 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:18:50.025019 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:23:50.043063 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:23:50.042940 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:23:50.046975 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:23:50.046951 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:28:50.062872 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:28:50.062760 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:28:50.066890 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:28:50.066865 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:33:50.085221 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:33:50.085118 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:33:50.089036 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:33:50.089018 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log" Mar 18 17:34:54.723100 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:54.723019 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-svgkn_93ac321e-d058-4c0a-9994-91d01edf06db/global-pull-secret-syncer/0.log" Mar 18 17:34:54.885181 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:54.885151 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dsml5_7ef73172-0617-4d44-b9f4-f5d3832924d2/konnectivity-agent/0.log" Mar 18 17:34:54.986148 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:54.986062 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-231.ec2.internal_4c21613cafe918f7212fff6fba314410/haproxy/0.log" Mar 18 17:34:58.673026 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:58.672994 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2rhhp_ffc04bd0-9cc1-48d8-a749-36263f3c9b5a/node-exporter/0.log" Mar 18 17:34:58.696263 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:58.696239 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2rhhp_ffc04bd0-9cc1-48d8-a749-36263f3c9b5a/kube-rbac-proxy/0.log" Mar 18 17:34:58.723603 ip-10-0-141-231 kubenswrapper[2563]: I0318 
17:34:58.723584 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2rhhp_ffc04bd0-9cc1-48d8-a749-36263f3c9b5a/init-textfile/0.log"
Mar 18 17:34:58.920165 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:58.920118 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-f9vqn_03cfcb33-b824-4a9e-adb6-97fc3f3f59dc/kube-rbac-proxy-main/0.log"
Mar 18 17:34:58.948119 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:58.948064 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-f9vqn_03cfcb33-b824-4a9e-adb6-97fc3f3f59dc/kube-rbac-proxy-self/0.log"
Mar 18 17:34:58.980356 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:58.980335 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-f9vqn_03cfcb33-b824-4a9e-adb6-97fc3f3f59dc/openshift-state-metrics/0.log"
Mar 18 17:34:59.383993 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:59.383965 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-746897889c-jk4jx_4f60fdfc-0b54-4fb2-9481-e8f451e1ed37/thanos-query/0.log"
Mar 18 17:34:59.408236 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:59.408214 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-746897889c-jk4jx_4f60fdfc-0b54-4fb2-9481-e8f451e1ed37/kube-rbac-proxy-web/0.log"
Mar 18 17:34:59.430797 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:59.430765 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-746897889c-jk4jx_4f60fdfc-0b54-4fb2-9481-e8f451e1ed37/kube-rbac-proxy/0.log"
Mar 18 17:34:59.455536 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:59.455515 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-746897889c-jk4jx_4f60fdfc-0b54-4fb2-9481-e8f451e1ed37/prom-label-proxy/0.log"
Mar 18 17:34:59.480932 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:59.480901 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-746897889c-jk4jx_4f60fdfc-0b54-4fb2-9481-e8f451e1ed37/kube-rbac-proxy-rules/0.log"
Mar 18 17:34:59.504382 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:34:59.504360 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-746897889c-jk4jx_4f60fdfc-0b54-4fb2-9481-e8f451e1ed37/kube-rbac-proxy-metrics/0.log"
Mar 18 17:35:00.732601 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:00.732558 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-55b77584bb-kqsnz_19412877-e39c-4727-8a7e-12bcaf2b8450/networking-console-plugin/0.log"
Mar 18 17:35:01.190520 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:01.190474 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/2.log"
Mar 18 17:35:01.197912 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:01.197885 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-cbztj_f07a196d-57d0-4b38-a848-ed2272802021/console-operator/3.log"
Mar 18 17:35:01.585712 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:01.585628 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55849c9b96-vzdkp_74693339-c91f-44d2-b06e-a8a566db8d59/console/0.log"
Mar 18 17:35:01.626853 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:01.626830 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-5b85974fd6-rkfll_f544a96a-2222-4cef-9b87-65de64826ccb/download-server/0.log"
Mar 18 17:35:02.526079 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.526045 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"]
Mar 18 17:35:02.528453 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.528438 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.530483 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.530461 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gffj5\"/\"default-dockercfg-4qhj7\""
Mar 18 17:35:02.530641 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.530490 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gffj5\"/\"kube-root-ca.crt\""
Mar 18 17:35:02.530641 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.530543 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gffj5\"/\"openshift-service-ca.crt\""
Mar 18 17:35:02.538518 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.538484 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"]
Mar 18 17:35:02.661656 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.661627 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-proc\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.661848 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.661669 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl5xn\" (UniqueName: \"kubernetes.io/projected/ee5a714f-978d-40b7-acc0-93695629a25c-kube-api-access-pl5xn\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.661848 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.661782 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-lib-modules\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.661848 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.661825 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-sys\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.661848 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.661846 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-podres\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.736448 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.736417 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kqxqw_81a2dba0-4d01-448c-accb-07510f0c8197/dns/0.log"
Mar 18 17:35:02.760981 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.760949 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kqxqw_81a2dba0-4d01-448c-accb-07510f0c8197/kube-rbac-proxy/0.log"
Mar 18 17:35:02.763043 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.763025 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-sys\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.763110 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.763051 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-podres\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.763110 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.763074 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-proc\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.763110 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.763104 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl5xn\" (UniqueName: \"kubernetes.io/projected/ee5a714f-978d-40b7-acc0-93695629a25c-kube-api-access-pl5xn\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.763224 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.763137 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-sys\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.763224 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.763157 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-proc\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.763224 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.763192 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-lib-modules\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.763224 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.763197 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-podres\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.763361 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.763295 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee5a714f-978d-40b7-acc0-93695629a25c-lib-modules\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.770533 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.770513 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl5xn\" (UniqueName: \"kubernetes.io/projected/ee5a714f-978d-40b7-acc0-93695629a25c-kube-api-access-pl5xn\") pod \"perf-node-gather-daemonset-qhlkj\" (UID: \"ee5a714f-978d-40b7-acc0-93695629a25c\") " pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.838204 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.838135 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:02.868787 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.868762 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mlpnc_d2859c82-5f61-4503-bb69-2147a77ca895/dns-node-resolver/0.log"
Mar 18 17:35:02.959788 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.959764 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"]
Mar 18 17:35:02.961513 ip-10-0-141-231 kubenswrapper[2563]: W0318 17:35:02.961469 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podee5a714f_978d_40b7_acc0_93695629a25c.slice/crio-39f8b7adead297a9502afe94398157046376c42412d41c3537223b443e95cb23 WatchSource:0}: Error finding container 39f8b7adead297a9502afe94398157046376c42412d41c3537223b443e95cb23: Status 404 returned error can't find the container with id 39f8b7adead297a9502afe94398157046376c42412d41c3537223b443e95cb23
Mar 18 17:35:02.963061 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:02.963045 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 17:35:03.363548 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:03.363511 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj" event={"ID":"ee5a714f-978d-40b7-acc0-93695629a25c","Type":"ContainerStarted","Data":"e6dcfe8cd7b0f36c5aa1d3c13d79792bd287d58a388228cc3e2c0f935d73e5ac"}
Mar 18 17:35:03.363548 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:03.363552 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj" event={"ID":"ee5a714f-978d-40b7-acc0-93695629a25c","Type":"ContainerStarted","Data":"39f8b7adead297a9502afe94398157046376c42412d41c3537223b443e95cb23"}
Mar 18 17:35:03.363755 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:03.363652 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:03.380152 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:03.380108 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj" podStartSLOduration=1.3800963130000001 podStartE2EDuration="1.380096313s" podCreationTimestamp="2026-03-18 17:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 17:35:03.378743485 +0000 UTC m=+3073.970684037" watchObservedRunningTime="2026-03-18 17:35:03.380096313 +0000 UTC m=+3073.972036863"
Mar 18 17:35:03.404559 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:03.404490 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-f9chd_4aac1f96-b5de-49d5-84a3-176806d103dc/node-ca/0.log"
Mar 18 17:35:04.527824 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:04.527792 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rzxp5_f6b9e883-ffdb-4d9a-9f0a-5e114ac87b4c/serve-healthcheck-canary/0.log"
Mar 18 17:35:04.912244 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:04.912216 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dhxqp_b785a821-c366-4ed7-8772-a55ce63347cb/kube-rbac-proxy/0.log"
Mar 18 17:35:04.934768 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:04.934746 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dhxqp_b785a821-c366-4ed7-8772-a55ce63347cb/exporter/0.log"
Mar 18 17:35:04.959965 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:04.959945 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dhxqp_b785a821-c366-4ed7-8772-a55ce63347cb/extractor/0.log"
Mar 18 17:35:07.030262 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:07.030233 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-69d7c9bbdc-p79cf_02e2fe9d-0d5f-42f1-b883-1fbe230e412d/manager/0.log"
Mar 18 17:35:07.661814 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:07.661786 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-748c497bc-kj2h8_cd5c1761-2a97-4523-a981-0528a774c21a/seaweedfs/0.log"
Mar 18 17:35:09.376260 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:09.376234 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gffj5/perf-node-gather-daemonset-qhlkj"
Mar 18 17:35:11.478326 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:11.478298 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-676zt_e701bc5a-b789-4dcd-9e11-12dadc2022b2/migrator/0.log"
Mar 18 17:35:11.502250 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:11.502227 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-6b589cdcc-676zt_e701bc5a-b789-4dcd-9e11-12dadc2022b2/graceful-termination/0.log"
Mar 18 17:35:13.042737 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.042712 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jxfqg_ef8cdb92-dfe0-4b41-8256-a466ea85d67a/kube-multus-additional-cni-plugins/0.log"
Mar 18 17:35:13.065298 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.065273 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jxfqg_ef8cdb92-dfe0-4b41-8256-a466ea85d67a/egress-router-binary-copy/0.log"
Mar 18 17:35:13.088683 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.088665 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jxfqg_ef8cdb92-dfe0-4b41-8256-a466ea85d67a/cni-plugins/0.log"
Mar 18 17:35:13.110524 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.110481 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jxfqg_ef8cdb92-dfe0-4b41-8256-a466ea85d67a/bond-cni-plugin/0.log"
Mar 18 17:35:13.132057 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.132039 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jxfqg_ef8cdb92-dfe0-4b41-8256-a466ea85d67a/routeoverride-cni/0.log"
Mar 18 17:35:13.155420 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.155400 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jxfqg_ef8cdb92-dfe0-4b41-8256-a466ea85d67a/whereabouts-cni-bincopy/0.log"
Mar 18 17:35:13.177102 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.177081 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jxfqg_ef8cdb92-dfe0-4b41-8256-a466ea85d67a/whereabouts-cni/0.log"
Mar 18 17:35:13.397709 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.397685 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-glstl_777f8e63-1d9b-4424-b5e7-62a6ccb4658f/kube-multus/0.log"
Mar 18 17:35:13.556685 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.556657 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kbxwz_c0b2d8c5-09a4-472a-aad2-fa033be042f3/network-metrics-daemon/0.log"
Mar 18 17:35:13.579857 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:13.579836 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kbxwz_c0b2d8c5-09a4-472a-aad2-fa033be042f3/kube-rbac-proxy/0.log"
Mar 18 17:35:14.481127 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:14.481100 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gjw2_c2da91ba-0645-46a4-a59d-b7219ba40de9/ovn-controller/0.log"
Mar 18 17:35:14.528290 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:14.528265 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gjw2_c2da91ba-0645-46a4-a59d-b7219ba40de9/ovn-acl-logging/0.log"
Mar 18 17:35:14.555061 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:14.555038 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gjw2_c2da91ba-0645-46a4-a59d-b7219ba40de9/kube-rbac-proxy-node/0.log"
Mar 18 17:35:14.589468 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:14.589449 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gjw2_c2da91ba-0645-46a4-a59d-b7219ba40de9/kube-rbac-proxy-ovn-metrics/0.log"
Mar 18 17:35:14.609479 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:14.609462 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gjw2_c2da91ba-0645-46a4-a59d-b7219ba40de9/northd/0.log"
Mar 18 17:35:14.633164 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:14.633143 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gjw2_c2da91ba-0645-46a4-a59d-b7219ba40de9/nbdb/0.log"
Mar 18 17:35:14.657584 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:14.657564 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gjw2_c2da91ba-0645-46a4-a59d-b7219ba40de9/sbdb/0.log"
Mar 18 17:35:14.808361 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:14.808291 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9gjw2_c2da91ba-0645-46a4-a59d-b7219ba40de9/ovnkube-controller/0.log"
Mar 18 17:35:16.654560 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:16.654531 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-s8rrk_fde7e0f5-379e-4950-a766-5b94afe18049/network-check-target-container/0.log"
Mar 18 17:35:17.622222 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:17.622187 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-nztqh_2e2ef5c8-0952-4eef-a0d9-19656f20a5a5/iptables-alerter/0.log"
Mar 18 17:35:18.291459 ip-10-0-141-231 kubenswrapper[2563]: I0318 17:35:18.291426 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lxcdc_41131912-91a1-43ad-a23a-203bd6091794/tuned/0.log"