May 06 17:10:08.313614 ip-10-0-135-110 systemd[1]: Starting Kubernetes Kubelet...
May 06 17:10:08.752933 ip-10-0-135-110 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 06 17:10:08.752933 ip-10-0-135-110 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
May 06 17:10:08.752933 ip-10-0-135-110 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 06 17:10:08.752933 ip-10-0-135-110 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 06 17:10:08.752933 ip-10-0-135-110 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 06 17:10:08.753857 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.753778 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 06 17:10:08.756572 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756558 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 06 17:10:08.756572 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756572 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756575 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756580 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756583 2576 feature_gate.go:328] unrecognized feature gate: Example2
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756586 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756589 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756592 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756594 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756597 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756600 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756602 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756605 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756613 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756616 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756618 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756621 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756624 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756626 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756629 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756631 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 06 17:10:08.756631 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756635 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756638 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756641 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756644 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756647 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756649 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756652 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756654 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756656 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756658 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756661 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756663 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756667 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756671 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756674 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756677 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756679 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756682 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756684 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756686 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 06 17:10:08.757083 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756689 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756692 2576 feature_gate.go:328] unrecognized feature gate: Example
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756695 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756697 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756699 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756701 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756704 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756707 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756711 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756713 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756716 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756719 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756721 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756724 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756727 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756730 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756732 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756735 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756738 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 06 17:10:08.757563 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756740 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756742 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756745 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756747 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756750 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756752 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756754 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756757 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756759 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756762 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756764 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756767 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756769 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756773 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756777 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756780 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756783 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756786 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756788 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 06 17:10:08.758034 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756791 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756794 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756797 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756799 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756802 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756804 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.756806 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757140 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757145 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757147 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757150 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757153 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757155 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757158 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757160 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757163 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757166 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757168 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757171 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757173 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 06 17:10:08.758496 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757176 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757178 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757180 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757183 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757185 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757187 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757190 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757192 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757195 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757198 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757201 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757206 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757209 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757212 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757215 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757217 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757220 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757223 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757249 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 06 17:10:08.758954 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757253 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757257 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757261 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757264 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757266 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757269 2576 feature_gate.go:328] unrecognized feature gate: Example
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757271 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757274 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757277 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757279 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757281 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757284 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757286 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757290 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757293 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757295 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757297 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757300 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757302 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757305 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 06 17:10:08.759415 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757307 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757309 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757312 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757315 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757318 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757321 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757323 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757326 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757328 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757331 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757335 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757337 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757339 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757342 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757345 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757347 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757349 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757352 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757355 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 06 17:10:08.759878 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757358 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757360 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757363 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757366 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757368 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757371 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757373 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757376 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757378 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757380 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757383 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757385 2576 feature_gate.go:328] unrecognized feature gate: Example2
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757388 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757390 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.757392 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.758979 2576 flags.go:64] FLAG: --address="0.0.0.0"
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.758992 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759000 2576 flags.go:64] FLAG: --anonymous-auth="true"
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759005 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759009 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759012 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
May 06 17:10:08.760344 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759016 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759020 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759024 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759027 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759030 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759033 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759036 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759039 2576 flags.go:64] FLAG: --cgroup-root=""
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759042 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759045 2576 flags.go:64] FLAG: --client-ca-file=""
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759047 2576 flags.go:64] FLAG: --cloud-config=""
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759050 2576 flags.go:64] FLAG: --cloud-provider="external"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759053 2576 flags.go:64] FLAG: --cluster-dns="[]"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759057 2576 flags.go:64] FLAG: --cluster-domain=""
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759059 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759062 2576 flags.go:64] FLAG: --config-dir=""
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759065 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759068 2576 flags.go:64] FLAG: --container-log-max-files="5"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759072 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759075 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759078 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759081 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759084 2576 flags.go:64] FLAG: --contention-profiling="false"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759087 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
May 06 17:10:08.760846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759090 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759093 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759096 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759100 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759104 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759107 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759110 2576 flags.go:64] FLAG: --enable-load-reader="false"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759113 2576 flags.go:64] FLAG: --enable-server="true"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759115 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759120 2576 flags.go:64] FLAG: --event-burst="100"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759123 2576 flags.go:64] FLAG: --event-qps="50"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759126 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759129 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759132 2576 flags.go:64] FLAG: --eviction-hard=""
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759136 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759139 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759141 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759144 2576 flags.go:64] FLAG: --eviction-soft=""
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759147 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759150 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759153 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759156 2576 flags.go:64] FLAG: --experimental-mounter-path=""
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759159 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759162 2576 flags.go:64] FLAG: --fail-swap-on="true"
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759165 2576 flags.go:64] FLAG: --feature-gates=""
May 06 17:10:08.761428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759168 2576 flags.go:64] FLAG: --file-check-frequency="20s"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759171 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759174 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759178 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759180 2576 flags.go:64] FLAG: --healthz-port="10248"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759183 2576 flags.go:64] FLAG: --help="false"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759186 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-135-110.ec2.internal"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759189 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759192 2576 flags.go:64] FLAG: --http-check-frequency="20s"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759195 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759198 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759202 2576 flags.go:64] FLAG: --image-gc-high-threshold="85"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759205 2576 flags.go:64] FLAG: --image-gc-low-threshold="80"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759208 2576 flags.go:64] FLAG: --image-service-endpoint=""
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759211 2576 flags.go:64] FLAG: --kernel-memcg-notification="false"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759214 2576 flags.go:64] FLAG: --kube-api-burst="100"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759217 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759220 2576 flags.go:64] FLAG: --kube-api-qps="50"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759222 2576 flags.go:64] FLAG: --kube-reserved=""
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759236 2576 flags.go:64] FLAG: --kube-reserved-cgroup=""
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759239 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759242 2576 flags.go:64] FLAG: --kubelet-cgroups=""
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759245 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759248 2576 flags.go:64] FLAG: --lock-file=""
May 06 17:10:08.762021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759251 2576 flags.go:64] FLAG: --log-cadvisor-usage="false"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759254 2576 flags.go:64] FLAG: --log-flush-frequency="5s"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759257 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759262 2576 flags.go:64] FLAG: --log-json-split-stream="false"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759265 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759268 2576 flags.go:64] FLAG: --log-text-split-stream="false"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759271 2576 flags.go:64] FLAG: --logging-format="text"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759274 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759277 2576 flags.go:64] FLAG: --make-iptables-util-chains="true"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759280 2576 flags.go:64] FLAG: --manifest-url=""
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759283 2576 flags.go:64] FLAG: --manifest-url-header=""
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759287 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759290 2576 flags.go:64] FLAG: --max-open-files="1000000"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759295 2576 flags.go:64] FLAG: --max-pods="110"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759297 2576 flags.go:64] FLAG: --maximum-dead-containers="-1"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759300 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759304 2576 flags.go:64] FLAG: --memory-manager-policy="None"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759307 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759309 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759312 2576 flags.go:64] FLAG: --node-ip="0.0.0.0"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759315 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759322 2576 flags.go:64] FLAG: --node-status-max-images="50"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759326 2576 flags.go:64] FLAG: --node-status-update-frequency="10s"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759329 2576 flags.go:64] FLAG: --oom-score-adj="-999"
May 06 17:10:08.762603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759332 2576 flags.go:64] FLAG: --pod-cidr=""
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759334 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3fc6c2cc09f271efd3cd2adb6c984c7cab48ea53dad824c952dee91afa8eaa20"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759340 2576 flags.go:64] FLAG: --pod-manifest-path=""
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759342 2576 flags.go:64] FLAG: --pod-max-pids="-1"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759345 2576 flags.go:64] FLAG: --pods-per-core="0"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759348 2576 flags.go:64] FLAG: --port="10250"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759351 2576 flags.go:64] FLAG: --protect-kernel-defaults="false"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759354 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e803e88e32467738"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759357 2576 flags.go:64] FLAG: --qos-reserved=""
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759360 2576 flags.go:64] FLAG: --read-only-port="10255"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759362 2576 flags.go:64] FLAG: --register-node="true"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759365 2576 flags.go:64] FLAG: --register-schedulable="true"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759368 2576 flags.go:64] FLAG: --register-with-taints=""
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759372 2576 flags.go:64] FLAG: --registry-burst="10"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759375 2576 flags.go:64] FLAG: --registry-qps="5"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759378 2576 flags.go:64] FLAG: --reserved-cpus=""
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759380 2576 flags.go:64] FLAG: --reserved-memory=""
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759384 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759386 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759389 2576 flags.go:64] FLAG: --rotate-certificates="false"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759392 2576 flags.go:64] FLAG: --rotate-server-certificates="false"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759394 2576 flags.go:64] FLAG: --runonce="false"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759401 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759404 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759407 2576 flags.go:64] FLAG: --seccomp-default="false"
May 06 17:10:08.763194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759410 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759412 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759415 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759419 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759422 2576 flags.go:64] FLAG: --storage-driver-password="root"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759425 2576 flags.go:64] FLAG: --storage-driver-secure="false"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759429 2576 flags.go:64] FLAG: --storage-driver-table="stats"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759431 2576 flags.go:64] FLAG: --storage-driver-user="root"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759434 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759437 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759440 2576 flags.go:64] FLAG: --system-cgroups=""
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759443 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759448 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759451 2576 flags.go:64] FLAG: --tls-cert-file=""
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759453 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759457 2576 flags.go:64] FLAG: --tls-min-version=""
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759460 2576 flags.go:64] FLAG: --tls-private-key-file=""
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759462 2576 flags.go:64] FLAG: --topology-manager-policy="none"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759468 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759470 2576 flags.go:64] FLAG: --topology-manager-scope="container"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759474 2576 flags.go:64] FLAG: --v="2"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759477 2576 flags.go:64] FLAG: --version="false"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759481 2576 flags.go:64] FLAG: --vmodule=""
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759485 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759488 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
May 06 17:10:08.763802 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759590 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759595 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759598 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759601 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759605 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759607 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759610 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759613 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759615 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759617 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759620 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759623 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759625 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759629 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759631 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759634 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759636 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759638 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759640 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759643 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 06 17:10:08.764412 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759645 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759648 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759650 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759653 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759655 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759659 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759661 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759664 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759666 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759668 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759671 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759673 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759676 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759678 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759681 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759683 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759687 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759690 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759692 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 06 17:10:08.764895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759695 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759697 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759700 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759702 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759705 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759707 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759710 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759713 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759715 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759717 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759720 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759722 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759725 2576 feature_gate.go:328] unrecognized feature gate: Example
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759728 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759730 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759732 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759735 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759737 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759741 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759744 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 06 17:10:08.765382 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759748 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759751 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759754 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759756 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759759 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759761 2576 feature_gate.go:328] unrecognized feature gate: Example2
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759764 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759766 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759770 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759775 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759778 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759781 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759784 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759786 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759789 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759791 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759794 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759796 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759799 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759801 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 06 17:10:08.765957 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759804 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759807 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759809 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759812 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759814 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759816 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.759819 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.759824 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.765802 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.10"
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.765821 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.765968 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.765975 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.765981 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.765986 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.765991 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766001 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 06 17:10:08.766737 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766007 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766013 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766019 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766024 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766028 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766032 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766036 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766041 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766045 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766049 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766054 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766062 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766067 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766071 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766075 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766079 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766083 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766087 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766091 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766095 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 06 17:10:08.767125 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766099 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766103 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766107 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766111 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766119 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766126 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766130 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766134 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766139 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766143 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766147 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766151 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766155 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766159 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766163 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766167 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766171 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766180 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766184 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766188 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 06 17:10:08.767620 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766192 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766196 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766200 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766204 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766208 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766212 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766216 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766253 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766259 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766270 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766276 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766281 2576 feature_gate.go:328] unrecognized feature gate: Example
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766286 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766290 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766294 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766299 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766302 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766307 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766313 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 06 17:10:08.768099 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766317 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766321 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766330 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766334 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766339 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766343 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766347 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766351 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766355 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766359 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766363 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766367 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766371 2576 feature_gate.go:328] unrecognized feature gate: Example2
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766375 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766379 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766388 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766393 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766396 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766400 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766404 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 06 17:10:08.768562 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766408 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.766417 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766828 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766841 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766845 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766848 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766852 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766854 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766857 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766860 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766863 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766866 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766868 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766871 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766874 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 06 17:10:08.769023 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766877 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766880 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766882 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766886 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766888 2576 feature_gate.go:328] unrecognized feature gate: Example2
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766891 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766894 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766896 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766899 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766901 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766903 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766907 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766909 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766912 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766914 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766917 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766919 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766923 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766927 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766930 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 06 17:10:08.769394 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766934 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766937 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766939 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766942 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766944 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766946 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766949 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766951 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766953 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766956 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766959 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766961 2576 feature_gate.go:328] unrecognized feature gate: Example
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766964 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766966 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766968 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766971 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766973 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766976 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766978 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766980 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 06 17:10:08.769918 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766983 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766985 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766987 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766990 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766992 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766994 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766997 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.766999 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767002 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767004 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767007 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767009 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767012 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767015 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767017 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767019 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767022 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767024 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767027 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 06 17:10:08.770398 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767029 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767032 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767036 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767039 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767041 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767043 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767046 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767048 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767051 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767053 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767055 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767058 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767060 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:08.767062 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.767068 2576 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} May 06 17:10:08.770830 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.767690 2576 server.go:962] "Client rotation is on, will bootstrap in background" May 06 17:10:08.773760 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.773747 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" May 06 17:10:08.774638 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.774627 2576 server.go:1019] "Starting client certificate rotation" May 06 17:10:08.774738 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.774722 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" May 06 17:10:08.774769 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.774764 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" May 06 17:10:08.798594 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.798579 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" May 06 17:10:08.800868 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.800850 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" May 06 17:10:08.812646 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.812630 2576 log.go:25] "Validated CRI v1 runtime API" May 06 17:10:08.818743 ip-10-0-135-110 
kubenswrapper[2576]: I0506 17:10:08.818727 2576 log.go:25] "Validated CRI v1 image API" May 06 17:10:08.820457 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.820443 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 06 17:10:08.824590 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.824566 2576 fs.go:135] Filesystem UUIDs: map[44adc4c3-029f-4e04-8085-3e22fdcc1c60:/dev/nvme0n1p3 62882b83-f3cd-4204-8bff-19190add74dc:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] May 06 17:10:08.824651 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.824590 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] May 06 17:10:08.831718 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.831442 2576 manager.go:217] Machine: {Timestamp:2026-05-06 17:10:08.829574394 +0000 UTC m=+0.392729476 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3203844 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec273439a6d675fcd1259117bc805e49 SystemUUID:ec273439-a6d6-75fc-d125-9117bc805e49 BootID:5b6e163a-40df-4285-b45e-27de47fe21b6 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 
HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:55:10:72:f4:8d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:55:10:72:f4:8d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:12:b1:c6:a4:1c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} May 06 17:10:08.831718 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.831708 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
May 06 17:10:08.831870 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.831808 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.112.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260504-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
May 06 17:10:08.831969 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.831952 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
May 06 17:10:08.832943 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.832921 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 06 17:10:08.833089 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.832945 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-110.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 06 17:10:08.833206 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.833101 2576 topology_manager.go:138] "Creating topology manager with none policy"
May 06 17:10:08.833206 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.833114 2576 container_manager_linux.go:306] "Creating device plugin manager"
May 06 17:10:08.833206 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.833132 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
May 06 17:10:08.834832 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.834817 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
May 06 17:10:08.836076 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.836064 2576 state_mem.go:36] "Initialized new in-memory state store"
May 06 17:10:08.836348 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.836337 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
May 06 17:10:08.839063 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.839052 2576 kubelet.go:491] "Attempting to sync node with API server"
May 06 17:10:08.839859 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.839847 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
May 06 17:10:08.839925 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.839878 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
May 06 17:10:08.839925 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.839891 2576 kubelet.go:397] "Adding apiserver pod source"
May 06 17:10:08.839925 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.839903 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 06 17:10:08.840943 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.840931 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 06 17:10:08.841002 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.840952 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 06 17:10:08.843858 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.843844 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.11-2.rhaos4.20.gitb2a8320.el9" apiVersion="v1"
May 06 17:10:08.844983 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.844968 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
May 06 17:10:08.845810 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845797 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
May 06 17:10:08.845884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845815 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
May 06 17:10:08.845884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845824 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
May 06 17:10:08.845884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845832 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
May 06 17:10:08.845884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845841 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
May 06 17:10:08.845884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845849 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
May 06 17:10:08.845884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845857 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
May 06 17:10:08.845884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845866 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
May 06 17:10:08.845884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845876 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
May 06 17:10:08.845884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845884 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
May 06 17:10:08.846117 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.845902 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
May 06 17:10:08.846175 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.846164 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
May 06 17:10:08.848027 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.848014 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
May 06 17:10:08.848027 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.848029 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
May 06 17:10:08.851319 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.851306 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 06 17:10:08.851402 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.851377 2576 server.go:1295] "Started kubelet"
May 06 17:10:08.851481 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.851456 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
May 06 17:10:08.851564 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.851517 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 06 17:10:08.851602 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.851584 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
May 06 17:10:08.852062 ip-10-0-135-110 systemd[1]: Started Kubernetes Kubelet.
May 06 17:10:08.852588 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.852466 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-110.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
May 06 17:10:08.852588 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.852553 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-110.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
May 06 17:10:08.852588 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.852559 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
May 06 17:10:08.852733 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.852605 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 06 17:10:08.854516 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.854500 2576 server.go:317] "Adding debug handlers to kubelet server"
May 06 17:10:08.858115 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.857204 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-110.ec2.internal.18ad0913e02d72ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-110.ec2.internal,UID:ip-10-0-135-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-110.ec2.internal,},FirstTimestamp:2026-05-06 17:10:08.85131742 +0000 UTC m=+0.414472503,LastTimestamp:2026-05-06 17:10:08.85131742 +0000 UTC m=+0.414472503,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-110.ec2.internal,}"
May 06 17:10:08.859086 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.859071 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
May 06 17:10:08.859446 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.859427 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 06 17:10:08.859774 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.859742 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
May 06 17:10:08.860084 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.860067 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
May 06 17:10:08.860200 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.860181 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 06 17:10:08.860291 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.860084 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
May 06 17:10:08.860343 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.860323 2576 reconstruct.go:97] "Volume reconstruction finished"
May 06 17:10:08.860343 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.860331 2576 reconciler.go:26] "Reconciler: start to sync state"
May 06 17:10:08.860450 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.860421 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:08.861714 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.861697 2576 factory.go:153] Registering CRI-O factory
May 06 17:10:08.861784 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.861764 2576 factory.go:223] Registration of the crio container factory successfully
May 06 17:10:08.861830 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.861824 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
May 06 17:10:08.861884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.861833 2576 factory.go:55] Registering systemd factory
May 06 17:10:08.861884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.861840 2576 factory.go:223] Registration of the systemd container factory successfully
May 06 17:10:08.861884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.861862 2576 factory.go:103] Registering Raw factory
May 06 17:10:08.861884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.861876 2576 manager.go:1196] Started watching for new ooms in manager
May 06 17:10:08.862478 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.862426 2576 manager.go:319] Starting recovery of all containers
May 06 17:10:08.864846 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.864820 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
May 06 17:10:08.865353 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.865312 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-110.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
May 06 17:10:08.870721 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.870583 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
May 06 17:10:08.874788 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.874775 2576 manager.go:324] Recovery completed
May 06 17:10:08.876398 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.876376 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7wgxr"
May 06 17:10:08.878688 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.878677 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:08.880882 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.880869 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:08.880954 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.880892 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:08.880954 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.880902 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:08.881375 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.881360 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
May 06 17:10:08.881375 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.881374 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
May 06 17:10:08.881454 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.881387 2576 state_mem.go:36] "Initialized new in-memory state store"
May 06 17:10:08.882841 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.882769 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-110.ec2.internal.18ad0913e1f08eeb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-110.ec2.internal,UID:ip-10-0-135-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-110.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-110.ec2.internal,},FirstTimestamp:2026-05-06 17:10:08.880881387 +0000 UTC m=+0.444036468,LastTimestamp:2026-05-06 17:10:08.880881387 +0000 UTC m=+0.444036468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-110.ec2.internal,}"
May 06 17:10:08.883409 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.883398 2576 policy_none.go:49] "None policy: Start"
May 06 17:10:08.883443 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.883413 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
May 06 17:10:08.883443 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.883423 2576 state_mem.go:35] "Initializing new in-memory state store"
May 06 17:10:08.884582 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.884569 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7wgxr"
May 06 17:10:08.922031 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.921796 2576 manager.go:341] "Starting Device Plugin manager"
May 06 17:10:08.922031 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.921822 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
May 06 17:10:08.922031 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.921835 2576 server.go:85] "Starting device plugin registration server"
May 06 17:10:08.922204 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.922054 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
May 06 17:10:08.922204 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.922086 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 06 17:10:08.922204 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.922147 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
May 06 17:10:08.922370 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.922252 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
May 06 17:10:08.922370 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.922263 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 06 17:10:08.922756 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.922737 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
May 06 17:10:08.922837 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.922774 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:08.945396 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.945380 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
May 06 17:10:08.945491 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.945408 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
May 06 17:10:08.945491 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.945427 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 06 17:10:08.945491 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.945438 2576 kubelet.go:2451] "Starting kubelet main sync loop"
May 06 17:10:08.945491 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:08.945473 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
May 06 17:10:08.947637 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:08.947622 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
May 06 17:10:09.022706 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.022650 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:09.023560 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.023545 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:09.023645 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.023578 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:09.023645 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.023595 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:09.023645 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.023623 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-110.ec2.internal"
May 06 17:10:09.033680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.033667 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-110.ec2.internal"
May 06 17:10:09.033737 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.033688 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-110.ec2.internal\": node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.045568 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.045548 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal"]
May 06 17:10:09.045657 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.045612 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:09.047100 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.047086 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:09.047180 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.047117 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:09.047180 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.047131 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:09.048726 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.048713 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:09.048899 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.048886 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.048985 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.048913 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:09.049107 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.049090 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.049371 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.049352 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:09.049371 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.049374 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:09.049514 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.049386 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:09.049514 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.049430 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:09.049514 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.049446 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:09.049514 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.049456 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:09.050753 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.050740 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.050801 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.050762 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 06 17:10:09.051814 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.051800 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientMemory"
May 06 17:10:09.051870 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.051826 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasNoDiskPressure"
May 06 17:10:09.051870 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.051837 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeHasSufficientPID"
May 06 17:10:09.083800 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.083783 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-110.ec2.internal\" not found" node="ip-10-0-135-110.ec2.internal"
May 06 17:10:09.088072 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.088059 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-110.ec2.internal\" not found" node="ip-10-0-135-110.ec2.internal"
May 06 17:10:09.149734 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.149717 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.161051 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.161034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/69f28b5eb96ffb4da7325eae53e8acbd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal\" (UID: \"69f28b5eb96ffb4da7325eae53e8acbd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.161098 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.161056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69f28b5eb96ffb4da7325eae53e8acbd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal\" (UID: \"69f28b5eb96ffb4da7325eae53e8acbd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.161098 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.161073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f6b582af7a45e0c31cfa2ac695736e00-config\") pod \"kube-apiserver-proxy-ip-10-0-135-110.ec2.internal\" (UID: \"f6b582af7a45e0c31cfa2ac695736e00\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.250194 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.250169 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.261437 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.261420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/69f28b5eb96ffb4da7325eae53e8acbd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal\" (UID: \"69f28b5eb96ffb4da7325eae53e8acbd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.261483 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.261450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69f28b5eb96ffb4da7325eae53e8acbd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal\" (UID: \"69f28b5eb96ffb4da7325eae53e8acbd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.261521 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.261468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f6b582af7a45e0c31cfa2ac695736e00-config\") pod \"kube-apiserver-proxy-ip-10-0-135-110.ec2.internal\" (UID: \"f6b582af7a45e0c31cfa2ac695736e00\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.261554 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.261518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f6b582af7a45e0c31cfa2ac695736e00-config\") pod \"kube-apiserver-proxy-ip-10-0-135-110.ec2.internal\" (UID: \"f6b582af7a45e0c31cfa2ac695736e00\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.261554 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.261521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69f28b5eb96ffb4da7325eae53e8acbd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal\" (UID: \"69f28b5eb96ffb4da7325eae53e8acbd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.261554 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.261527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/69f28b5eb96ffb4da7325eae53e8acbd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal\" (UID: \"69f28b5eb96ffb4da7325eae53e8acbd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.350821 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.350805 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.386269 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.386254 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.390633 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.390611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal"
May 06 17:10:09.451151 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.451119 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.551596 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.551569 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.652041 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.651992 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.752521 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.752493 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.774956 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.774935 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
May 06 17:10:09.775477 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.775084 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
May 06 17:10:09.853495 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.853465 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:09.859496 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.859480 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
May 06 17:10:09.872065 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.872047 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
May 06 17:10:09.886844 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.886820 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-05-05 17:05:08 +0000 UTC" deadline="2028-02-18 23:10:50.750906488 +0000 UTC"
May 06 17:10:09.886844 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.886841 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15678h0m40.864068784s"
May 06 17:10:09.893017 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.892999 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dmrkx"
May 06 17:10:09.901030 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.901013 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dmrkx"
May 06 17:10:09.916161 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:09.916115 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
May 06 17:10:09.954476 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:09.954458 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:10.032556 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:10.032528 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f28b5eb96ffb4da7325eae53e8acbd.slice/crio-cfc23ce9984ad24ae597e35abe0b1eecd5b8548ea431a9799f79b0034a5a448f WatchSource:0}: Error finding container cfc23ce9984ad24ae597e35abe0b1eecd5b8548ea431a9799f79b0034a5a448f: Status 404 returned error can't find the container with id cfc23ce9984ad24ae597e35abe0b1eecd5b8548ea431a9799f79b0034a5a448f
May 06 17:10:10.032766 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:10.032743 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6b582af7a45e0c31cfa2ac695736e00.slice/crio-bdffce9d29fc807a996cff97682b3e6a2b080b85ffa954d66f74c324a6398796 WatchSource:0}: Error finding container bdffce9d29fc807a996cff97682b3e6a2b080b85ffa954d66f74c324a6398796: Status 404 returned error can't find the container with id bdffce9d29fc807a996cff97682b3e6a2b080b85ffa954d66f74c324a6398796
May 06 17:10:10.040050 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.040034 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
May 06 17:10:10.054697 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:10.054675 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 17:10:10.155221 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:10.155201 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-110.ec2.internal\" not found"
May 06 
17:10:10.197756 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.197717 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" May 06 17:10:10.234537 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.234521 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" May 06 17:10:10.260158 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.260139 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal" May 06 17:10:10.279557 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.279539 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 06 17:10:10.280391 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.280381 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal" May 06 17:10:10.293418 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.293405 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 06 17:10:10.840616 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.840588 2576 apiserver.go:52] "Watching apiserver" May 06 17:10:10.846878 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.846858 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" May 06 17:10:10.847358 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.847335 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/konnectivity-agent-lq7gr","openshift-cluster-node-tuning-operator/tuned-xx97c","openshift-image-registry/node-ca-85dtq","openshift-multus/multus-additional-cni-plugins-7wfpq","openshift-ovn-kubernetes/ovnkube-node-bbxnx","kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal","openshift-multus/multus-ghkv2","openshift-multus/network-metrics-daemon-mvgqp","openshift-network-diagnostics/network-check-target-qht56","openshift-network-operator/iptables-alerter-jsm9m"] May 06 17:10:10.849340 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.849321 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lq7gr" May 06 17:10:10.851494 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.851476 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.851774 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.851751 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-pn98x\"" May 06 17:10:10.852021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.852004 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" May 06 17:10:10.852203 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.852188 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" May 06 17:10:10.852550 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.852534 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-85dtq" May 06 17:10:10.854843 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.854824 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2twxn\"" May 06 17:10:10.855051 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.855035 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" May 06 17:10:10.855260 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.855215 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" May 06 17:10:10.855585 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.855566 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" May 06 17:10:10.855656 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.855638 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" May 06 17:10:10.855752 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.855738 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" May 06 17:10:10.855935 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.855921 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wnmdq\"" May 06 17:10:10.856637 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.856619 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.857292 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.857276 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.858037 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.858021 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.859806 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.859415 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" May 06 17:10:10.859806 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.859647 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" May 06 17:10:10.859806 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.859751 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" May 06 17:10:10.859971 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.859850 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" May 06 17:10:10.859971 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.859901 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" May 06 17:10:10.860071 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.860033 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" May 06 17:10:10.860119 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.860109 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" May 06 17:10:10.860530 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.860192 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" May 06 17:10:10.860530 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.860375 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mdbrn\"" May 06 17:10:10.861171 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.861030 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h7f27\"" May 06 17:10:10.861898 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.861721 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" May 06 17:10:10.862009 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.861899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" May 06 17:10:10.862082 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.862039 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" May 06 17:10:10.862205 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.862190 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" May 06 17:10:10.862379 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.862365 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" May 06 17:10:10.862865 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.862843 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" May 06 17:10:10.863044 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.863024 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-dfgww\"" May 06 17:10:10.863837 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.863629 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:10.863837 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:10.863694 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:10.863837 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.863835 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:10.864823 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:10.864796 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:10.864909 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.864890 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ghkv2" May 06 17:10:10.865413 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.865396 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:10.867696 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.867669 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" May 06 17:10:10.867995 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.867977 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xf56v\"" May 06 17:10:10.868260 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868243 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" May 06 17:10:10.868359 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868342 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-wz2vr\"" May 06 17:10:10.868528 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868499 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-os-release\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.868624 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-slash\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.868624 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-log-socket\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.868766 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868634 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" May 06 17:10:10.868766 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa50d981-80fc-4dbd-83a3-f8f9cef34743-cni-binary-copy\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.868766 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6286127c-14fd-44e0-9034-230ab16d2f46-konnectivity-ca\") pod \"konnectivity-agent-lq7gr\" (UID: \"6286127c-14fd-44e0-9034-230ab16d2f46\") " pod="kube-system/konnectivity-agent-lq7gr" May 06 17:10:10.868766 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868701 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-system-cni-dir\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.868766 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-node-log\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.868766 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-var-lib-kubelet\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.869082 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4870cbd6-d111-4dd5-b84d-b7abb6469f33-ovn-node-metrics-cert\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.869082 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-hostroot\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.869082 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-etc-kubernetes\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.869082 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.868918 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:10.869082 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.869082 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e4f7af8a-6313-4c92-9c2a-385f8580c399-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.869082 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-var-lib-openvswitch\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.869424 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-run-openvswitch\") pod \"ovnkube-node-bbxnx\" (UID: 
\"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.869424 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-registration-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.869424 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-var-lib-cni-multus\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.869424 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-cnibin\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.869424 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-run-ovn\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.869916 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.869979 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4870cbd6-d111-4dd5-b84d-b7abb6469f33-ovnkube-script-lib\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.870050 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.869979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-sysconfig\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.870050 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ed404eb-a555-4ae7-b728-791f9d60c831-serviceca\") pod \"node-ca-85dtq\" (UID: \"9ed404eb-a555-4ae7-b728-791f9d60c831\") " pod="openshift-image-registry/node-ca-85dtq" May 06 17:10:10.870159 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-var-lib-cni-bin\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.870243 ip-10-0-135-110 
kubenswrapper[2576]: I0506 17:10:10.870164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb7ts\" (UniqueName: \"kubernetes.io/projected/e4f7af8a-6313-4c92-9c2a-385f8580c399-kube-api-access-hb7ts\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.870243 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-etc-openvswitch\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.870382 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4870cbd6-d111-4dd5-b84d-b7abb6469f33-ovnkube-config\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.870382 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5r5j\" (UniqueName: \"kubernetes.io/projected/4870cbd6-d111-4dd5-b84d-b7abb6469f33-kube-api-access-s5r5j\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.870486 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870395 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-cnibin\") pod 
\"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.870486 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg764\" (UniqueName: \"kubernetes.io/projected/2c28b880-a50d-4878-bf4e-20dc0f464cc2-kube-api-access-hg764\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp"
May 06 17:10:10.870486 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-run-systemd\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.870748 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-tuned\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.870905 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.870891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ed404eb-a555-4ae7-b728-791f9d60c831-host\") pod \"node-ca-85dtq\" (UID: \"9ed404eb-a555-4ae7-b728-791f9d60c831\") " pod="openshift-image-registry/node-ca-85dtq"
May 06 17:10:10.871053 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-run-multus-certs\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.871115 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-run-ovn-kubernetes\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.871115 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-cni-netd\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.871209 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-kubernetes\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.871209 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-sysctl-d\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.871209 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871151 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-systemd\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.871209 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-system-cni-dir\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.871419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d074d7a7-13e7-4b23-a0fc-9523795f60e1-tmp\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.871419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-run-netns\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.871419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-var-lib-kubelet\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.871419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e4f7af8a-6313-4c92-9c2a-385f8580c399-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq"
May 06 17:10:10.871610 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-run-netns\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.871610 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-cni-bin\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.871610 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871485 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
May 06 17:10:10.871610 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-device-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr"
May 06 17:10:10.871610 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-sys\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.871610 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-daemon-config\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.871942 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-etc-selinux\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr"
May 06 17:10:10.871942 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-socket-dir-parent\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.871942 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2w74\" (UniqueName: \"kubernetes.io/projected/643a9363-15b5-4077-948f-22eacf68dede-kube-api-access-l2w74\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr"
May 06 17:10:10.871942 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-run\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.871942 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-conf-dir\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.871942 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr"
May 06 17:10:10.871942 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-socket-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr"
May 06 17:10:10.871942 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-modprobe-d\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.871942 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-host\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.872335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.871967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x85mb\" (UniqueName: \"kubernetes.io/projected/9ed404eb-a555-4ae7-b728-791f9d60c831-kube-api-access-x85mb\") pod \"node-ca-85dtq\" (UID: \"9ed404eb-a555-4ae7-b728-791f9d60c831\") " pod="openshift-image-registry/node-ca-85dtq"
May 06 17:10:10.872335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwlkh\" (UniqueName: \"kubernetes.io/projected/fa50d981-80fc-4dbd-83a3-f8f9cef34743-kube-api-access-gwlkh\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.872335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-sysctl-conf\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.872335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxg9q\" (UniqueName: \"kubernetes.io/projected/d074d7a7-13e7-4b23-a0fc-9523795f60e1-kube-api-access-bxg9q\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.872335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-cni-dir\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.872335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e4f7af8a-6313-4c92-9c2a-385f8580c399-cni-binary-copy\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq"
May 06 17:10:10.872335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872333 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4870cbd6-d111-4dd5-b84d-b7abb6469f33-env-overrides\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.872646 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6286127c-14fd-44e0-9034-230ab16d2f46-agent-certs\") pod \"konnectivity-agent-lq7gr\" (UID: \"6286127c-14fd-44e0-9034-230ab16d2f46\") " pod="kube-system/konnectivity-agent-lq7gr"
May 06 17:10:10.872646 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-kubelet\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.872646 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-systemd-units\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.872646 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-sys-fs\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr"
May 06 17:10:10.872646 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-lib-modules\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.872646 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-os-release\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.872646 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.872549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-run-k8s-cni-cncf-io\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.902726 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.902599 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-05 17:05:09 +0000 UTC" deadline="2028-01-16 07:27:29.84916927 +0000 UTC"
May 06 17:10:10.902726 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.902626 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14870h17m18.946546782s"
May 06 17:10:10.946741 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.946587 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
May 06 17:10:10.952730 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.952683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal" event={"ID":"f6b582af7a45e0c31cfa2ac695736e00","Type":"ContainerStarted","Data":"bdffce9d29fc807a996cff97682b3e6a2b080b85ffa954d66f74c324a6398796"}
May 06 17:10:10.954649 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.954624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal" event={"ID":"69f28b5eb96ffb4da7325eae53e8acbd","Type":"ContainerStarted","Data":"cfc23ce9984ad24ae597e35abe0b1eecd5b8548ea431a9799f79b0034a5a448f"}
May 06 17:10:10.961675 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.961655 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
May 06 17:10:10.973419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6286127c-14fd-44e0-9034-230ab16d2f46-agent-certs\") pod \"konnectivity-agent-lq7gr\" (UID: \"6286127c-14fd-44e0-9034-230ab16d2f46\") " pod="kube-system/konnectivity-agent-lq7gr"
May 06 17:10:10.973419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-kubelet\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.973419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-systemd-units\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.973419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-sys-fs\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr"
May 06 17:10:10.973419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-lib-modules\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.973419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-kubelet\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.973419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-os-release\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.973419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-systemd-units\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-run-k8s-cni-cncf-io\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-lib-modules\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-os-release\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-slash\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-log-socket\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-sys-fs\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-run-k8s-cni-cncf-io\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-slash\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-os-release\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa50d981-80fc-4dbd-83a3-f8f9cef34743-cni-binary-copy\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-log-socket\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbrbj\" (UniqueName: \"kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj\") pod \"network-check-target-qht56\" (UID: \"0d02e3af-24fa-4677-818d-3668647af67f\") " pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6286127c-14fd-44e0-9034-230ab16d2f46-konnectivity-ca\") pod \"konnectivity-agent-lq7gr\" (UID: \"6286127c-14fd-44e0-9034-230ab16d2f46\") " pod="kube-system/konnectivity-agent-lq7gr"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-system-cni-dir\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-node-log\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973737 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-var-lib-kubelet\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.973833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4870cbd6-d111-4dd5-b84d-b7abb6469f33-ovn-node-metrics-cert\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-hostroot\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-etc-kubernetes\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e4f7af8a-6313-4c92-9c2a-385f8580c399-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-var-lib-openvswitch\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-run-openvswitch\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-registration-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-var-lib-cni-multus\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-cnibin\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-run-ovn\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fa50d981-80fc-4dbd-83a3-f8f9cef34743-cni-binary-copy\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-system-cni-dir\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974205 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-hostroot\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.973644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-os-release\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.974640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4870cbd6-d111-4dd5-b84d-b7abb6469f33-ovnkube-script-lib\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974313 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-sysconfig\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-node-log\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ed404eb-a555-4ae7-b728-791f9d60c831-serviceca\") pod \"node-ca-85dtq\" (UID: \"9ed404eb-a555-4ae7-b728-791f9d60c831\") " pod="openshift-image-registry/node-ca-85dtq"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-var-lib-openvswitch\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-var-lib-kubelet\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-var-lib-cni-bin\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-var-lib-cni-multus\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-etc-kubernetes\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2"
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:10.974545 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-run-openvswitch\") pod \"ovnkube-node-bbxnx\" (UID:
\"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-cnibin\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:10.974636 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs podName:2c28b880-a50d-4878-bf4e-20dc0f464cc2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:11.474592113 +0000 UTC m=+3.037747199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs") pod "network-metrics-daemon-mvgqp" (UID: "2c28b880-a50d-4878-bf4e-20dc0f464cc2") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-run-ovn\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-sysconfig\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.975422 
ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6286127c-14fd-44e0-9034-230ab16d2f46-konnectivity-ca\") pod \"konnectivity-agent-lq7gr\" (UID: \"6286127c-14fd-44e0-9034-230ab16d2f46\") " pod="kube-system/konnectivity-agent-lq7gr" May 06 17:10:10.975422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e4f7af8a-6313-4c92-9c2a-385f8580c399-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-registration-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-var-lib-cni-bin\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hb7ts\" (UniqueName: \"kubernetes.io/projected/e4f7af8a-6313-4c92-9c2a-385f8580c399-kube-api-access-hb7ts\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") 
" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-etc-openvswitch\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4870cbd6-d111-4dd5-b84d-b7abb6469f33-ovnkube-config\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5r5j\" (UniqueName: \"kubernetes.io/projected/4870cbd6-d111-4dd5-b84d-b7abb6469f33-kube-api-access-s5r5j\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-cnibin\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hg764\" (UniqueName: \"kubernetes.io/projected/2c28b880-a50d-4878-bf4e-20dc0f464cc2-kube-api-access-hg764\") pod \"network-metrics-daemon-mvgqp\" (UID: 
\"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-run-systemd\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-tuned\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4870cbd6-d111-4dd5-b84d-b7abb6469f33-ovnkube-script-lib\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ed404eb-a555-4ae7-b728-791f9d60c831-host\") pod \"node-ca-85dtq\" (UID: \"9ed404eb-a555-4ae7-b728-791f9d60c831\") " pod="openshift-image-registry/node-ca-85dtq" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-run-multus-certs\") pod \"multus-ghkv2\" (UID: 
\"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974984 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4afd1a68-be62-4fa2-9876-997fdda7a250-host-slash\") pod \"iptables-alerter-jsm9m\" (UID: \"4afd1a68-be62-4fa2-9876-997fdda7a250\") " pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.974999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-run-ovn-kubernetes\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-cni-netd\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.976176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-kubernetes\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-sysctl-d\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-systemd\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-system-cni-dir\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d074d7a7-13e7-4b23-a0fc-9523795f60e1-tmp\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-run-netns\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-var-lib-kubelet\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975143 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ed404eb-a555-4ae7-b728-791f9d60c831-host\") pod \"node-ca-85dtq\" (UID: \"9ed404eb-a555-4ae7-b728-791f9d60c831\") " pod="openshift-image-registry/node-ca-85dtq" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e4f7af8a-6313-4c92-9c2a-385f8580c399-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-run-netns\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-cni-bin\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" 
(UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-device-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-sys\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-daemon-config\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-etc-selinux\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-socket-dir-parent\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-l2w74\" (UniqueName: \"kubernetes.io/projected/643a9363-15b5-4077-948f-22eacf68dede-kube-api-access-l2w74\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.976940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-run\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-conf-dir\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-socket-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975386 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-modprobe-d\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-etc-openvswitch\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-host\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-host\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x85mb\" (UniqueName: \"kubernetes.io/projected/9ed404eb-a555-4ae7-b728-791f9d60c831-kube-api-access-x85mb\") pod \"node-ca-85dtq\" (UID: \"9ed404eb-a555-4ae7-b728-791f9d60c831\") " pod="openshift-image-registry/node-ca-85dtq" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975472 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-run-multus-certs\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9ed404eb-a555-4ae7-b728-791f9d60c831-serviceca\") pod \"node-ca-85dtq\" (UID: \"9ed404eb-a555-4ae7-b728-791f9d60c831\") " pod="openshift-image-registry/node-ca-85dtq" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-cni-bin\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-device-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-run-ovn-kubernetes\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975666 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-sys\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-cni-netd\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-kubernetes\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-sysctl-d\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.977712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-systemd\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/4870cbd6-d111-4dd5-b84d-b7abb6469f33-ovnkube-config\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.975933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-system-cni-dir\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-daemon-config\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e4f7af8a-6313-4c92-9c2a-385f8580c399-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976130 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-cnibin\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-run-systemd\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976182 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-etc-selinux\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-socket-dir-parent\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-var-lib-kubelet\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-host-run-netns\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-run\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4870cbd6-d111-4dd5-b84d-b7abb6469f33-host-run-netns\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-conf-dir\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/643a9363-15b5-4077-948f-22eacf68dede-socket-dir\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-modprobe-d\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwlkh\" (UniqueName: \"kubernetes.io/projected/fa50d981-80fc-4dbd-83a3-f8f9cef34743-kube-api-access-gwlkh\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.978551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-sysctl-conf\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxg9q\" (UniqueName: \"kubernetes.io/projected/d074d7a7-13e7-4b23-a0fc-9523795f60e1-kube-api-access-bxg9q\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976645 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-cni-dir\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4afd1a68-be62-4fa2-9876-997fdda7a250-iptables-alerter-script\") pod \"iptables-alerter-jsm9m\" (UID: \"4afd1a68-be62-4fa2-9876-997fdda7a250\") " pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e4f7af8a-6313-4c92-9c2a-385f8580c399-cni-binary-copy\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4870cbd6-d111-4dd5-b84d-b7abb6469f33-env-overrides\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrjr\" (UniqueName: \"kubernetes.io/projected/4afd1a68-be62-4fa2-9876-997fdda7a250-kube-api-access-qxrjr\") pod \"iptables-alerter-jsm9m\" (UID: \"4afd1a68-be62-4fa2-9876-997fdda7a250\") " pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e4f7af8a-6313-4c92-9c2a-385f8580c399-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" 
May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fa50d981-80fc-4dbd-83a3-f8f9cef34743-multus-cni-dir\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.976885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-sysctl-conf\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.977202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e4f7af8a-6313-4c92-9c2a-385f8580c399-cni-binary-copy\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.977312 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4870cbd6-d111-4dd5-b84d-b7abb6469f33-env-overrides\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.978728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d074d7a7-13e7-4b23-a0fc-9523795f60e1-etc-tuned\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.980074 
ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.979064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d074d7a7-13e7-4b23-a0fc-9523795f60e1-tmp\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.980074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.979789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6286127c-14fd-44e0-9034-230ab16d2f46-agent-certs\") pod \"konnectivity-agent-lq7gr\" (UID: \"6286127c-14fd-44e0-9034-230ab16d2f46\") " pod="kube-system/konnectivity-agent-lq7gr" May 06 17:10:10.981931 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.981913 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4870cbd6-d111-4dd5-b84d-b7abb6469f33-ovn-node-metrics-cert\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:10.989308 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.989285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2w74\" (UniqueName: \"kubernetes.io/projected/643a9363-15b5-4077-948f-22eacf68dede-kube-api-access-l2w74\") pod \"aws-ebs-csi-driver-node-bjfnr\" (UID: \"643a9363-15b5-4077-948f-22eacf68dede\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:10.992616 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.992566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5r5j\" (UniqueName: \"kubernetes.io/projected/4870cbd6-d111-4dd5-b84d-b7abb6469f33-kube-api-access-s5r5j\") pod \"ovnkube-node-bbxnx\" (UID: \"4870cbd6-d111-4dd5-b84d-b7abb6469f33\") " pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 
17:10:10.992702 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.992676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxg9q\" (UniqueName: \"kubernetes.io/projected/d074d7a7-13e7-4b23-a0fc-9523795f60e1-kube-api-access-bxg9q\") pod \"tuned-xx97c\" (UID: \"d074d7a7-13e7-4b23-a0fc-9523795f60e1\") " pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:10.994998 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.993914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x85mb\" (UniqueName: \"kubernetes.io/projected/9ed404eb-a555-4ae7-b728-791f9d60c831-kube-api-access-x85mb\") pod \"node-ca-85dtq\" (UID: \"9ed404eb-a555-4ae7-b728-791f9d60c831\") " pod="openshift-image-registry/node-ca-85dtq" May 06 17:10:10.994998 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.994281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwlkh\" (UniqueName: \"kubernetes.io/projected/fa50d981-80fc-4dbd-83a3-f8f9cef34743-kube-api-access-gwlkh\") pod \"multus-ghkv2\" (UID: \"fa50d981-80fc-4dbd-83a3-f8f9cef34743\") " pod="openshift-multus/multus-ghkv2" May 06 17:10:10.996205 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.996170 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb7ts\" (UniqueName: \"kubernetes.io/projected/e4f7af8a-6313-4c92-9c2a-385f8580c399-kube-api-access-hb7ts\") pod \"multus-additional-cni-plugins-7wfpq\" (UID: \"e4f7af8a-6313-4c92-9c2a-385f8580c399\") " pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:10.997556 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:10.997534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg764\" (UniqueName: \"kubernetes.io/projected/2c28b880-a50d-4878-bf4e-20dc0f464cc2-kube-api-access-hg764\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " 
pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:11.077243 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.077202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4afd1a68-be62-4fa2-9876-997fdda7a250-host-slash\") pod \"iptables-alerter-jsm9m\" (UID: \"4afd1a68-be62-4fa2-9876-997fdda7a250\") " pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:11.077348 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.077272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4afd1a68-be62-4fa2-9876-997fdda7a250-iptables-alerter-script\") pod \"iptables-alerter-jsm9m\" (UID: \"4afd1a68-be62-4fa2-9876-997fdda7a250\") " pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:11.077348 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.077295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrjr\" (UniqueName: \"kubernetes.io/projected/4afd1a68-be62-4fa2-9876-997fdda7a250-kube-api-access-qxrjr\") pod \"iptables-alerter-jsm9m\" (UID: \"4afd1a68-be62-4fa2-9876-997fdda7a250\") " pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:11.077348 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.077304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4afd1a68-be62-4fa2-9876-997fdda7a250-host-slash\") pod \"iptables-alerter-jsm9m\" (UID: \"4afd1a68-be62-4fa2-9876-997fdda7a250\") " pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:11.077348 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.077318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrbj\" (UniqueName: 
\"kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj\") pod \"network-check-target-qht56\" (UID: \"0d02e3af-24fa-4677-818d-3668647af67f\") " pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:11.077849 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.077828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4afd1a68-be62-4fa2-9876-997fdda7a250-iptables-alerter-script\") pod \"iptables-alerter-jsm9m\" (UID: \"4afd1a68-be62-4fa2-9876-997fdda7a250\") " pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:11.084612 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.084583 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 06 17:10:11.084612 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.084607 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 06 17:10:11.084761 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.084620 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cbrbj for pod openshift-network-diagnostics/network-check-target-qht56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:11.084761 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.084692 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj podName:0d02e3af-24fa-4677-818d-3668647af67f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:11.584675049 +0000 UTC m=+3.147830120 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cbrbj" (UniqueName: "kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj") pod "network-check-target-qht56" (UID: "0d02e3af-24fa-4677-818d-3668647af67f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:11.086989 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.086966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrjr\" (UniqueName: \"kubernetes.io/projected/4afd1a68-be62-4fa2-9876-997fdda7a250-kube-api-access-qxrjr\") pod \"iptables-alerter-jsm9m\" (UID: \"4afd1a68-be62-4fa2-9876-997fdda7a250\") " pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:11.166473 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.166412 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lq7gr" May 06 17:10:11.175988 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.175968 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xx97c" May 06 17:10:11.185630 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.185612 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-85dtq" May 06 17:10:11.192199 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.192183 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:11.199745 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.199728 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" May 06 17:10:11.207276 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.207258 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" May 06 17:10:11.213803 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.213784 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ghkv2" May 06 17:10:11.221292 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.221276 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jsm9m" May 06 17:10:11.478887 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.478806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:11.479025 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.478947 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:11.479025 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.479008 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs podName:2c28b880-a50d-4878-bf4e-20dc0f464cc2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:12.478990221 +0000 UTC m=+4.042145296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs") pod "network-metrics-daemon-mvgqp" (UID: "2c28b880-a50d-4878-bf4e-20dc0f464cc2") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:11.680478 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.680435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrbj\" (UniqueName: \"kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj\") pod \"network-check-target-qht56\" (UID: \"0d02e3af-24fa-4677-818d-3668647af67f\") " pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:11.680676 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.680577 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 06 17:10:11.680676 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.680593 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 06 17:10:11.680676 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.680601 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cbrbj for pod openshift-network-diagnostics/network-check-target-qht56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:11.680676 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:11.680655 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj podName:0d02e3af-24fa-4677-818d-3668647af67f nodeName:}" failed. 
No retries permitted until 2026-05-06 17:10:12.680639446 +0000 UTC m=+4.243794519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbrbj" (UniqueName: "kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj") pod "network-check-target-qht56" (UID: "0d02e3af-24fa-4677-818d-3668647af67f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:11.709541 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:11.709515 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa50d981_80fc_4dbd_83a3_f8f9cef34743.slice/crio-6cec54cc20b5d0b6d2528f81d99eaffcffa3a1af7f26e9211449777595e4877a WatchSource:0}: Error finding container 6cec54cc20b5d0b6d2528f81d99eaffcffa3a1af7f26e9211449777595e4877a: Status 404 returned error can't find the container with id 6cec54cc20b5d0b6d2528f81d99eaffcffa3a1af7f26e9211449777595e4877a May 06 17:10:11.721187 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:11.721159 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4870cbd6_d111_4dd5_b84d_b7abb6469f33.slice/crio-f6794ccdfa36eaab422a2a939fda4fd21cc8d30d94dd52e79ef8d0ef7de81fa1 WatchSource:0}: Error finding container f6794ccdfa36eaab422a2a939fda4fd21cc8d30d94dd52e79ef8d0ef7de81fa1: Status 404 returned error can't find the container with id f6794ccdfa36eaab422a2a939fda4fd21cc8d30d94dd52e79ef8d0ef7de81fa1 May 06 17:10:11.722985 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:11.722963 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4f7af8a_6313_4c92_9c2a_385f8580c399.slice/crio-d44f600c17a58d410f89174bba2a9f4bfed4a581edf3c23631f7d51427ab29b2 WatchSource:0}: Error finding container 
d44f600c17a58d410f89174bba2a9f4bfed4a581edf3c23631f7d51427ab29b2: Status 404 returned error can't find the container with id d44f600c17a58d410f89174bba2a9f4bfed4a581edf3c23631f7d51427ab29b2 May 06 17:10:11.726373 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:11.726348 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd074d7a7_13e7_4b23_a0fc_9523795f60e1.slice/crio-e489fb8b6fad340161ad294f9e5bf3065879e715998ca1d8e4724fba68eb2971 WatchSource:0}: Error finding container e489fb8b6fad340161ad294f9e5bf3065879e715998ca1d8e4724fba68eb2971: Status 404 returned error can't find the container with id e489fb8b6fad340161ad294f9e5bf3065879e715998ca1d8e4724fba68eb2971 May 06 17:10:11.727260 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:11.727206 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod643a9363_15b5_4077_948f_22eacf68dede.slice/crio-28f3ca7331f1ae503dcfec9a718d713189fd822583f6bc24f81bb51ba6904830 WatchSource:0}: Error finding container 28f3ca7331f1ae503dcfec9a718d713189fd822583f6bc24f81bb51ba6904830: Status 404 returned error can't find the container with id 28f3ca7331f1ae503dcfec9a718d713189fd822583f6bc24f81bb51ba6904830 May 06 17:10:11.727785 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:11.727757 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6286127c_14fd_44e0_9034_230ab16d2f46.slice/crio-e433f32f028b3a7c027e6c301580d6e19d6c3956f7c4370545be3b5313b9adda WatchSource:0}: Error finding container e433f32f028b3a7c027e6c301580d6e19d6c3956f7c4370545be3b5313b9adda: Status 404 returned error can't find the container with id e433f32f028b3a7c027e6c301580d6e19d6c3956f7c4370545be3b5313b9adda May 06 17:10:11.729011 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:11.728949 2576 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4afd1a68_be62_4fa2_9876_997fdda7a250.slice/crio-8b293f313d35d275b781df9690c406f558bccac87a048ce5f373f630c72dccbb WatchSource:0}: Error finding container 8b293f313d35d275b781df9690c406f558bccac87a048ce5f373f630c72dccbb: Status 404 returned error can't find the container with id 8b293f313d35d275b781df9690c406f558bccac87a048ce5f373f630c72dccbb May 06 17:10:11.730118 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:11.730103 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed404eb_a555_4ae7_b728_791f9d60c831.slice/crio-ef1a6f8d391808eaa49a1d4374d38f9004c5532530a5ef6e4729914aeacccec5 WatchSource:0}: Error finding container ef1a6f8d391808eaa49a1d4374d38f9004c5532530a5ef6e4729914aeacccec5: Status 404 returned error can't find the container with id ef1a6f8d391808eaa49a1d4374d38f9004c5532530a5ef6e4729914aeacccec5 May 06 17:10:11.903878 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.903694 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-05 17:05:09 +0000 UTC" deadline="2027-10-15 23:49:55.876985009 +0000 UTC" May 06 17:10:11.903878 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.903867 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12654h39m43.973122127s" May 06 17:10:11.956524 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.956487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ghkv2" event={"ID":"fa50d981-80fc-4dbd-83a3-f8f9cef34743","Type":"ContainerStarted","Data":"6cec54cc20b5d0b6d2528f81d99eaffcffa3a1af7f26e9211449777595e4877a"} May 06 17:10:11.957403 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.957381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-jsm9m" event={"ID":"4afd1a68-be62-4fa2-9876-997fdda7a250","Type":"ContainerStarted","Data":"8b293f313d35d275b781df9690c406f558bccac87a048ce5f373f630c72dccbb"} May 06 17:10:11.958251 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.958220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xx97c" event={"ID":"d074d7a7-13e7-4b23-a0fc-9523795f60e1","Type":"ContainerStarted","Data":"e489fb8b6fad340161ad294f9e5bf3065879e715998ca1d8e4724fba68eb2971"} May 06 17:10:11.959031 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.959013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" event={"ID":"e4f7af8a-6313-4c92-9c2a-385f8580c399","Type":"ContainerStarted","Data":"d44f600c17a58d410f89174bba2a9f4bfed4a581edf3c23631f7d51427ab29b2"} May 06 17:10:11.959845 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.959827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerStarted","Data":"f6794ccdfa36eaab422a2a939fda4fd21cc8d30d94dd52e79ef8d0ef7de81fa1"} May 06 17:10:11.961036 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.961018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal" event={"ID":"f6b582af7a45e0c31cfa2ac695736e00","Type":"ContainerStarted","Data":"3091279cb4641127b8dbe0f5e52fd24a47728576e177f353e88c3bc88b06e4a7"} May 06 17:10:11.961812 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.961795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-85dtq" event={"ID":"9ed404eb-a555-4ae7-b728-791f9d60c831","Type":"ContainerStarted","Data":"ef1a6f8d391808eaa49a1d4374d38f9004c5532530a5ef6e4729914aeacccec5"} May 06 17:10:11.962515 ip-10-0-135-110 kubenswrapper[2576]: I0506 
17:10:11.962497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lq7gr" event={"ID":"6286127c-14fd-44e0-9034-230ab16d2f46","Type":"ContainerStarted","Data":"e433f32f028b3a7c027e6c301580d6e19d6c3956f7c4370545be3b5313b9adda"} May 06 17:10:11.963240 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.963210 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" event={"ID":"643a9363-15b5-4077-948f-22eacf68dede","Type":"ContainerStarted","Data":"28f3ca7331f1ae503dcfec9a718d713189fd822583f6bc24f81bb51ba6904830"} May 06 17:10:11.974430 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:11.974395 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-110.ec2.internal" podStartSLOduration=1.974385897 podStartE2EDuration="1.974385897s" podCreationTimestamp="2026-05-06 17:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:10:11.974131808 +0000 UTC m=+3.537286901" watchObservedRunningTime="2026-05-06 17:10:11.974385897 +0000 UTC m=+3.537540987" May 06 17:10:12.485889 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:12.485853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:12.486069 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:12.486034 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:12.486143 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:12.486099 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs podName:2c28b880-a50d-4878-bf4e-20dc0f464cc2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:14.486081434 +0000 UTC m=+6.049236507 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs") pod "network-metrics-daemon-mvgqp" (UID: "2c28b880-a50d-4878-bf4e-20dc0f464cc2") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:12.687933 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:12.687895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrbj\" (UniqueName: \"kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj\") pod \"network-check-target-qht56\" (UID: \"0d02e3af-24fa-4677-818d-3668647af67f\") " pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:12.688122 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:12.688062 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 06 17:10:12.688122 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:12.688081 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 06 17:10:12.688122 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:12.688091 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cbrbj for pod openshift-network-diagnostics/network-check-target-qht56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:12.688292 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:12.688145 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj podName:0d02e3af-24fa-4677-818d-3668647af67f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:14.688127727 +0000 UTC m=+6.251282796 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbrbj" (UniqueName: "kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj") pod "network-check-target-qht56" (UID: "0d02e3af-24fa-4677-818d-3668647af67f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:12.946693 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:12.945980 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:12.946693 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:12.946120 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:12.946693 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:12.946547 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:12.946693 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:12.946650 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:12.974671 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:12.974356 2576 generic.go:358] "Generic (PLEG): container finished" podID="69f28b5eb96ffb4da7325eae53e8acbd" containerID="ed153a439d84d9bdde4701ebd17a93d5594c21b2d5a8b8ededef7e55b96f3069" exitCode=0 May 06 17:10:12.974671 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:12.974471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal" event={"ID":"69f28b5eb96ffb4da7325eae53e8acbd","Type":"ContainerDied","Data":"ed153a439d84d9bdde4701ebd17a93d5594c21b2d5a8b8ededef7e55b96f3069"} May 06 17:10:13.984161 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:13.983524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal" event={"ID":"69f28b5eb96ffb4da7325eae53e8acbd","Type":"ContainerStarted","Data":"a8cf5c274dae28704703b904e00aaaad8f0a2716b522fe7873d116b6abcb83ea"} May 06 17:10:14.505891 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:14.505853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:14.506046 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:14.506001 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:14.506090 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:14.506069 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs podName:2c28b880-a50d-4878-bf4e-20dc0f464cc2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:18.506050011 +0000 UTC m=+10.069205083 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs") pod "network-metrics-daemon-mvgqp" (UID: "2c28b880-a50d-4878-bf4e-20dc0f464cc2") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:14.707588 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:14.707555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrbj\" (UniqueName: \"kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj\") pod \"network-check-target-qht56\" (UID: \"0d02e3af-24fa-4677-818d-3668647af67f\") " pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:14.707776 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:14.707724 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 06 17:10:14.707776 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:14.707741 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 06 17:10:14.707776 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:14.707753 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cbrbj for pod openshift-network-diagnostics/network-check-target-qht56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:14.707924 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:14.707806 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj podName:0d02e3af-24fa-4677-818d-3668647af67f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:18.707788988 +0000 UTC m=+10.270944058 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbrbj" (UniqueName: "kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj") pod "network-check-target-qht56" (UID: "0d02e3af-24fa-4677-818d-3668647af67f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:14.948092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:14.946335 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:14.948092 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:14.946465 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:14.948092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:14.947410 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:14.948092 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:14.947534 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:16.946053 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:16.946018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:16.946485 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:16.946017 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:16.946485 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:16.946166 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:16.946485 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:16.946245 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:18.539861 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:18.539828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:18.540216 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:18.539992 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:18.540216 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:18.540061 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs podName:2c28b880-a50d-4878-bf4e-20dc0f464cc2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:26.54004035 +0000 UTC m=+18.103195424 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs") pod "network-metrics-daemon-mvgqp" (UID: "2c28b880-a50d-4878-bf4e-20dc0f464cc2") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:18.741969 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:18.741930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrbj\" (UniqueName: \"kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj\") pod \"network-check-target-qht56\" (UID: \"0d02e3af-24fa-4677-818d-3668647af67f\") " pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:18.742165 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:18.742143 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 06 17:10:18.742223 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:18.742171 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 06 17:10:18.742223 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:18.742186 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cbrbj for pod openshift-network-diagnostics/network-check-target-qht56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:18.742344 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:18.742266 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj podName:0d02e3af-24fa-4677-818d-3668647af67f nodeName:}" failed. 
No retries permitted until 2026-05-06 17:10:26.742245951 +0000 UTC m=+18.305401037 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbrbj" (UniqueName: "kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj") pod "network-check-target-qht56" (UID: "0d02e3af-24fa-4677-818d-3668647af67f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:18.947105 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:18.946494 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:18.947105 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:18.946611 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:18.947105 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:18.946931 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:18.947105 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:18.947017 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:20.945897 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:20.945815 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:20.946357 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:20.945815 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:20.946357 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:20.945939 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:20.946357 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:20.946065 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:22.948260 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:22.948218 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:22.948260 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:22.948264 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:22.948727 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:22.948336 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:22.948727 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:22.948474 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:24.946545 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:24.946513 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:24.946977 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:24.946514 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:24.946977 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:24.946632 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:24.946977 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:24.946684 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:26.597162 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:26.597117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:26.597619 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:26.597299 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:26.597619 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:26.597378 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs podName:2c28b880-a50d-4878-bf4e-20dc0f464cc2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.597357013 +0000 UTC m=+34.160512087 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs") pod "network-metrics-daemon-mvgqp" (UID: "2c28b880-a50d-4878-bf4e-20dc0f464cc2") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:26.799216 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:26.799180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrbj\" (UniqueName: \"kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj\") pod \"network-check-target-qht56\" (UID: \"0d02e3af-24fa-4677-818d-3668647af67f\") " pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:26.799391 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:26.799358 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 06 17:10:26.799391 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:26.799381 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 06 17:10:26.799391 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:26.799393 2576 projected.go:194] Error preparing data for projected volume kube-api-access-cbrbj for pod openshift-network-diagnostics/network-check-target-qht56: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:26.799520 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:26.799454 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj podName:0d02e3af-24fa-4677-818d-3668647af67f nodeName:}" failed. 
No retries permitted until 2026-05-06 17:10:42.799433838 +0000 UTC m=+34.362588911 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cbrbj" (UniqueName: "kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj") pod "network-check-target-qht56" (UID: "0d02e3af-24fa-4677-818d-3668647af67f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 06 17:10:26.946673 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:26.946599 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:26.946833 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:26.946697 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:26.946833 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:26.946801 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:26.946942 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:26.946917 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:27.097081 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.097031 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-110.ec2.internal" podStartSLOduration=17.0970134 podStartE2EDuration="17.0970134s" podCreationTimestamp="2026-05-06 17:10:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:10:13.997887684 +0000 UTC m=+5.561042776" watchObservedRunningTime="2026-05-06 17:10:27.0970134 +0000 UTC m=+18.660168491" May 06 17:10:27.097648 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.097631 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-lsdnk"] May 06 17:10:27.234991 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.234916 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.235154 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:27.234992 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lsdnk" podUID="04fcdaac-b196-4dea-a077-864b3ee42652" May 06 17:10:27.302804 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.302774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.302945 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.302832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/04fcdaac-b196-4dea-a077-864b3ee42652-kubelet-config\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.302945 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.302881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/04fcdaac-b196-4dea-a077-864b3ee42652-dbus\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.403401 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.403370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.403401 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.403416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/04fcdaac-b196-4dea-a077-864b3ee42652-kubelet-config\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.403634 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.403440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/04fcdaac-b196-4dea-a077-864b3ee42652-dbus\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.403634 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.403529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/04fcdaac-b196-4dea-a077-864b3ee42652-kubelet-config\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.403634 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:27.403541 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 06 17:10:27.403634 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.403566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/04fcdaac-b196-4dea-a077-864b3ee42652-dbus\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.403634 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:27.403596 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret podName:04fcdaac-b196-4dea-a077-864b3ee42652 nodeName:}" failed. 
No retries permitted until 2026-05-06 17:10:27.903582642 +0000 UTC m=+19.466737711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret") pod "global-pull-secret-syncer-lsdnk" (UID: "04fcdaac-b196-4dea-a077-864b3ee42652") : object "kube-system"/"original-pull-secret" not registered May 06 17:10:27.906474 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:27.906438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:27.906905 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:27.906587 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 06 17:10:27.906905 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:27.906651 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret podName:04fcdaac-b196-4dea-a077-864b3ee42652 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:28.906635543 +0000 UTC m=+20.469790617 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret") pod "global-pull-secret-syncer-lsdnk" (UID: "04fcdaac-b196-4dea-a077-864b3ee42652") : object "kube-system"/"original-pull-secret" not registered May 06 17:10:28.914093 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:28.913862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:28.914547 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:28.913998 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 06 17:10:28.914547 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:28.914198 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret podName:04fcdaac-b196-4dea-a077-864b3ee42652 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:30.914180774 +0000 UTC m=+22.477335848 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret") pod "global-pull-secret-syncer-lsdnk" (UID: "04fcdaac-b196-4dea-a077-864b3ee42652") : object "kube-system"/"original-pull-secret" not registered May 06 17:10:28.946552 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:28.946511 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:28.946701 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:28.946611 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:28.946759 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:28.946693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:28.946821 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:28.946807 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:28.946868 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:28.946830 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:28.946933 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:28.946886 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-lsdnk" podUID="04fcdaac-b196-4dea-a077-864b3ee42652" May 06 17:10:29.008163 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.008132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-85dtq" event={"ID":"9ed404eb-a555-4ae7-b728-791f9d60c831","Type":"ContainerStarted","Data":"2f68f3a8dbee555907cb4e992d4a3b046b1fa30c27957dd22fd08386b98f44fb"} May 06 17:10:29.011138 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.011096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lq7gr" event={"ID":"6286127c-14fd-44e0-9034-230ab16d2f46","Type":"ContainerStarted","Data":"6bd537a6c158471ff9b9307ca7ca7193c3d129c76f7a7f3789bb82fa0bc96cdc"} May 06 17:10:29.012490 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.012465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" event={"ID":"643a9363-15b5-4077-948f-22eacf68dede","Type":"ContainerStarted","Data":"41a6764653b0887e316555568d4bfe527d4b57529e2d14fa8e63c4e9c9107105"} May 06 17:10:29.015171 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.015140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ghkv2" event={"ID":"fa50d981-80fc-4dbd-83a3-f8f9cef34743","Type":"ContainerStarted","Data":"916638bb74ed351e2fbd7baf4e69f57ee8cc719b1bd2c0464710960dd019006b"} May 06 17:10:29.016485 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.016453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xx97c" event={"ID":"d074d7a7-13e7-4b23-a0fc-9523795f60e1","Type":"ContainerStarted","Data":"6b029ddc5a10ea22a5a427b92496869674775faed55d608d6437348e62b92389"} May 06 17:10:29.017722 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.017704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" 
event={"ID":"e4f7af8a-6313-4c92-9c2a-385f8580c399","Type":"ContainerStarted","Data":"9c671872df78960a43233b1e264f4fdfbb8896c382cd3d444237afa96f5c8510"} May 06 17:10:29.042198 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.042159 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-85dtq" podStartSLOduration=7.662113814 podStartE2EDuration="20.042143936s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:10:11.731846513 +0000 UTC m=+3.295001586" lastFinishedPulling="2026-05-06 17:10:24.111876631 +0000 UTC m=+15.675031708" observedRunningTime="2026-05-06 17:10:29.02369769 +0000 UTC m=+20.586852781" watchObservedRunningTime="2026-05-06 17:10:29.042143936 +0000 UTC m=+20.605299026" May 06 17:10:29.056694 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.056645 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ghkv2" podStartSLOduration=3.131477319 podStartE2EDuration="20.056628678s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:10:11.719755557 +0000 UTC m=+3.282910630" lastFinishedPulling="2026-05-06 17:10:28.644906913 +0000 UTC m=+20.208061989" observedRunningTime="2026-05-06 17:10:29.056495633 +0000 UTC m=+20.619650736" watchObservedRunningTime="2026-05-06 17:10:29.056628678 +0000 UTC m=+20.619783769" May 06 17:10:29.086504 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.086445 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lq7gr" podStartSLOduration=3.192185811 podStartE2EDuration="20.086425271s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:10:11.730677871 +0000 UTC m=+3.293832955" lastFinishedPulling="2026-05-06 17:10:28.624917338 +0000 UTC m=+20.188072415" observedRunningTime="2026-05-06 17:10:29.071979554 +0000 UTC m=+20.635134646" 
watchObservedRunningTime="2026-05-06 17:10:29.086425271 +0000 UTC m=+20.649580362" May 06 17:10:29.086637 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.086555 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xx97c" podStartSLOduration=3.188470937 podStartE2EDuration="20.086548052s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:10:11.728303742 +0000 UTC m=+3.291458814" lastFinishedPulling="2026-05-06 17:10:28.626380856 +0000 UTC m=+20.189535929" observedRunningTime="2026-05-06 17:10:29.086053986 +0000 UTC m=+20.649209077" watchObservedRunningTime="2026-05-06 17:10:29.086548052 +0000 UTC m=+20.649703158" May 06 17:10:29.972552 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.972309 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lq7gr" May 06 17:10:29.973289 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:29.973272 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lq7gr" May 06 17:10:30.020432 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.020386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jsm9m" event={"ID":"4afd1a68-be62-4fa2-9876-997fdda7a250","Type":"ContainerStarted","Data":"33d2523c9b3921f95146ac910fdc2f9e07dbf5e095bbd8a074e2cffaf120010e"} May 06 17:10:30.021705 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.021675 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4f7af8a-6313-4c92-9c2a-385f8580c399" containerID="9c671872df78960a43233b1e264f4fdfbb8896c382cd3d444237afa96f5c8510" exitCode=0 May 06 17:10:30.021826 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.021750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" 
event={"ID":"e4f7af8a-6313-4c92-9c2a-385f8580c399","Type":"ContainerDied","Data":"9c671872df78960a43233b1e264f4fdfbb8896c382cd3d444237afa96f5c8510"} May 06 17:10:30.024104 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.024084 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:10:30.024392 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.024371 2576 generic.go:358] "Generic (PLEG): container finished" podID="4870cbd6-d111-4dd5-b84d-b7abb6469f33" containerID="16b516c47406d232f8666822fa9041fe98451151c5ab993d4400334c5e968c9b" exitCode=1 May 06 17:10:30.024484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.024398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerStarted","Data":"0c5ff5366907820a4f55429547828f4504f29149d8186a44d65e5d9a04d2cb8b"} May 06 17:10:30.024484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.024424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerStarted","Data":"216ac9828cdb0b7190541c44df2f479a742166e0dadca8d78f322c271560beb9"} May 06 17:10:30.024484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.024437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerStarted","Data":"832106a9d28ffa64e2d0452792b21a5c19ffca6933a6f23956a05cb73b238b6c"} May 06 17:10:30.024484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.024448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" 
event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerDied","Data":"16b516c47406d232f8666822fa9041fe98451151c5ab993d4400334c5e968c9b"} May 06 17:10:30.024484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.024466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerStarted","Data":"72f1b7626991a678be4e860cd8796caa497ef0905cce192736134b1399f28766"} May 06 17:10:30.025062 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.025044 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lq7gr" May 06 17:10:30.025656 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.025637 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lq7gr" May 06 17:10:30.036893 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.036854 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jsm9m" podStartSLOduration=4.14310284 podStartE2EDuration="21.036839784s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:10:11.731125496 +0000 UTC m=+3.294280577" lastFinishedPulling="2026-05-06 17:10:28.624862447 +0000 UTC m=+20.188017521" observedRunningTime="2026-05-06 17:10:30.036314248 +0000 UTC m=+21.599469363" watchObservedRunningTime="2026-05-06 17:10:30.036839784 +0000 UTC m=+21.599994873" May 06 17:10:30.228089 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.228062 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" May 06 17:10:30.928824 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.928787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:30.929051 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:30.928939 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 06 17:10:30.929051 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:30.929019 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret podName:04fcdaac-b196-4dea-a077-864b3ee42652 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:34.928998981 +0000 UTC m=+26.492154055 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret") pod "global-pull-secret-syncer-lsdnk" (UID: "04fcdaac-b196-4dea-a077-864b3ee42652") : object "kube-system"/"original-pull-secret" not registered May 06 17:10:30.935804 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.935682 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-05-06T17:10:30.228075063Z","UUID":"14d2ffcc-268b-41ef-8a0c-c8c7c12ee60a","Handler":null,"Name":"","Endpoint":""} May 06 17:10:30.937458 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.937426 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 May 06 17:10:30.937458 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.937455 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock May 06 17:10:30.946476 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.946452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:30.946476 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.946466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:30.946652 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:30.946452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:30.946652 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:30.946572 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lsdnk" podUID="04fcdaac-b196-4dea-a077-864b3ee42652" May 06 17:10:30.946893 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:30.946865 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:30.946982 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:30.946924 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:31.029884 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:31.029858 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:10:31.030439 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:31.030315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerStarted","Data":"a60fbb703d17e31529e1d5223742c982adb691c74bfe69e4936e329ee7148c3c"} May 06 17:10:31.032671 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:31.032640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" event={"ID":"643a9363-15b5-4077-948f-22eacf68dede","Type":"ContainerStarted","Data":"b1491b167df1c3fe200860a8a35f4d43b2ff0097be6bbd66ce5c94e1a95d342c"} May 06 17:10:32.036859 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:32.036641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" event={"ID":"643a9363-15b5-4077-948f-22eacf68dede","Type":"ContainerStarted","Data":"6167318ec95f64c6798827f11a4f7bd3ffd87720b071abdf3d619cac53dbd41c"} May 06 17:10:32.054763 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:32.054721 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bjfnr" podStartSLOduration=3.4116924109999998 podStartE2EDuration="23.054704535s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:10:11.729583252 +0000 UTC m=+3.292738322" lastFinishedPulling="2026-05-06 17:10:31.372595377 +0000 UTC m=+22.935750446" observedRunningTime="2026-05-06 17:10:32.054619721 
+0000 UTC m=+23.617774826" watchObservedRunningTime="2026-05-06 17:10:32.054704535 +0000 UTC m=+23.617859628" May 06 17:10:32.945786 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:32.945751 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:32.945983 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:32.945752 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:32.945983 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:32.945870 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:32.945983 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:32.945922 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lsdnk" podUID="04fcdaac-b196-4dea-a077-864b3ee42652" May 06 17:10:32.945983 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:32.945751 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:32.946174 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:32.946014 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:33.969100 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:33.969020 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6jd7q"] May 06 17:10:33.980117 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:33.980097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:33.982370 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:33.982350 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" May 06 17:10:33.982464 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:33.982356 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rvt4b\"" May 06 17:10:33.982464 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:33.982358 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" May 06 17:10:34.042075 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.042039 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4f7af8a-6313-4c92-9c2a-385f8580c399" containerID="d71594103ec3ffca408e0ad010ce8c7652486ea96eba4743b9689f4f872643b1" exitCode=0 May 06 17:10:34.042253 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.042119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-7wfpq" event={"ID":"e4f7af8a-6313-4c92-9c2a-385f8580c399","Type":"ContainerDied","Data":"d71594103ec3ffca408e0ad010ce8c7652486ea96eba4743b9689f4f872643b1"} May 06 17:10:34.047325 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.047298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:10:34.048568 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.048544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerStarted","Data":"d5d7cf24be51b9b3fde3744e2d82ad83259f3e703e0f434d0186b450f312f21c"} May 06 17:10:34.054696 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.054678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/babb97ac-5bf7-447e-9b34-f306f1a7d566-hosts-file\") pod \"node-resolver-6jd7q\" (UID: \"babb97ac-5bf7-447e-9b34-f306f1a7d566\") " pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.054785 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.054723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpn9m\" (UniqueName: \"kubernetes.io/projected/babb97ac-5bf7-447e-9b34-f306f1a7d566-kube-api-access-zpn9m\") pod \"node-resolver-6jd7q\" (UID: \"babb97ac-5bf7-447e-9b34-f306f1a7d566\") " pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.054846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.054823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/babb97ac-5bf7-447e-9b34-f306f1a7d566-tmp-dir\") pod \"node-resolver-6jd7q\" (UID: \"babb97ac-5bf7-447e-9b34-f306f1a7d566\") " 
pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.155330 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.155292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpn9m\" (UniqueName: \"kubernetes.io/projected/babb97ac-5bf7-447e-9b34-f306f1a7d566-kube-api-access-zpn9m\") pod \"node-resolver-6jd7q\" (UID: \"babb97ac-5bf7-447e-9b34-f306f1a7d566\") " pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.155492 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.155370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/babb97ac-5bf7-447e-9b34-f306f1a7d566-tmp-dir\") pod \"node-resolver-6jd7q\" (UID: \"babb97ac-5bf7-447e-9b34-f306f1a7d566\") " pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.155492 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.155401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/babb97ac-5bf7-447e-9b34-f306f1a7d566-hosts-file\") pod \"node-resolver-6jd7q\" (UID: \"babb97ac-5bf7-447e-9b34-f306f1a7d566\") " pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.155613 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.155561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/babb97ac-5bf7-447e-9b34-f306f1a7d566-hosts-file\") pod \"node-resolver-6jd7q\" (UID: \"babb97ac-5bf7-447e-9b34-f306f1a7d566\") " pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.155724 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.155700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/babb97ac-5bf7-447e-9b34-f306f1a7d566-tmp-dir\") pod \"node-resolver-6jd7q\" (UID: \"babb97ac-5bf7-447e-9b34-f306f1a7d566\") " pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.165664 
ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.165643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpn9m\" (UniqueName: \"kubernetes.io/projected/babb97ac-5bf7-447e-9b34-f306f1a7d566-kube-api-access-zpn9m\") pod \"node-resolver-6jd7q\" (UID: \"babb97ac-5bf7-447e-9b34-f306f1a7d566\") " pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.288895 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.288874 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6jd7q" May 06 17:10:34.295611 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:34.295584 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbabb97ac_5bf7_447e_9b34_f306f1a7d566.slice/crio-b8d344b62359beb5e97f2879a3fb652408c9f7a5fd48ddf39ab91dd10834f987 WatchSource:0}: Error finding container b8d344b62359beb5e97f2879a3fb652408c9f7a5fd48ddf39ab91dd10834f987: Status 404 returned error can't find the container with id b8d344b62359beb5e97f2879a3fb652408c9f7a5fd48ddf39ab91dd10834f987 May 06 17:10:34.948279 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.948108 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:34.948405 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.948106 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:34.948405 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:34.948380 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2" May 06 17:10:34.948519 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.948165 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:34.948519 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:34.948417 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lsdnk" podUID="04fcdaac-b196-4dea-a077-864b3ee42652" May 06 17:10:34.948519 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:34.948492 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f" May 06 17:10:34.960424 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:34.960397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk" May 06 17:10:34.960542 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:34.960526 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 06 17:10:34.960584 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:34.960576 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret podName:04fcdaac-b196-4dea-a077-864b3ee42652 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.960563643 +0000 UTC m=+34.523718712 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret") pod "global-pull-secret-syncer-lsdnk" (UID: "04fcdaac-b196-4dea-a077-864b3ee42652") : object "kube-system"/"original-pull-secret" not registered May 06 17:10:35.052036 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:35.051999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6jd7q" event={"ID":"babb97ac-5bf7-447e-9b34-f306f1a7d566","Type":"ContainerStarted","Data":"8b2987f4fada096d539b80ba65488370f2806e243099d86f5a0945c677a579ad"} May 06 17:10:35.052594 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:35.052046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6jd7q" event={"ID":"babb97ac-5bf7-447e-9b34-f306f1a7d566","Type":"ContainerStarted","Data":"b8d344b62359beb5e97f2879a3fb652408c9f7a5fd48ddf39ab91dd10834f987"} May 06 17:10:35.053766 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:35.053747 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4f7af8a-6313-4c92-9c2a-385f8580c399" containerID="d05aa0a91c3de0aafd7f0c7ab9375024a0a5d3dba1cea8825c3b899a8053e17e" exitCode=0 May 06 17:10:35.053864 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:35.053779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" event={"ID":"e4f7af8a-6313-4c92-9c2a-385f8580c399","Type":"ContainerDied","Data":"d05aa0a91c3de0aafd7f0c7ab9375024a0a5d3dba1cea8825c3b899a8053e17e"} May 06 17:10:35.067471 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:35.067428 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6jd7q" podStartSLOduration=2.067414701 podStartE2EDuration="2.067414701s" podCreationTimestamp="2026-05-06 17:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-05-06 17:10:35.067077372 +0000 UTC m=+26.630232457" watchObservedRunningTime="2026-05-06 17:10:35.067414701 +0000 UTC m=+26.630569792" May 06 17:10:36.057404 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.057365 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4f7af8a-6313-4c92-9c2a-385f8580c399" containerID="d83ebfae8edbcf0776cb020d56630d6128e2360358fe66a4d42d0127c7af7a89" exitCode=0 May 06 17:10:36.057793 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.057446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" event={"ID":"e4f7af8a-6313-4c92-9c2a-385f8580c399","Type":"ContainerDied","Data":"d83ebfae8edbcf0776cb020d56630d6128e2360358fe66a4d42d0127c7af7a89"} May 06 17:10:36.060395 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.060331 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:10:36.060655 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.060633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerStarted","Data":"5be0fa1ee04f3dcc83bcde03a1f14c8509d7386716d0b0284032148849be5e7f"} May 06 17:10:36.061050 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.061033 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:36.061122 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.061058 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:10:36.061122 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.061068 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:36.061303 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.061183 2576 scope.go:117] "RemoveContainer" containerID="16b516c47406d232f8666822fa9041fe98451151c5ab993d4400334c5e968c9b"
May 06 17:10:36.076985 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.076968 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:36.077049 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.077043 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx"
May 06 17:10:36.945994 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.945958 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk"
May 06 17:10:36.946186 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.945959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp"
May 06 17:10:36.946186 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:36.946102 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lsdnk" podUID="04fcdaac-b196-4dea-a077-864b3ee42652"
May 06 17:10:36.946186 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:36.945958 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:10:36.946373 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:36.946175 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2"
May 06 17:10:36.946373 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:36.946249 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f"
May 06 17:10:37.066377 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:37.066349 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log"
May 06 17:10:37.066846 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:37.066723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" event={"ID":"4870cbd6-d111-4dd5-b84d-b7abb6469f33","Type":"ContainerStarted","Data":"3214ef713e34a651ec2aecbfcaeefaa9024ad31a274a861c9ea77b07a6e85178"}
May 06 17:10:37.099738 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:37.099688 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" podStartSLOduration=10.595241956 podStartE2EDuration="28.099672036s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:10:11.724476017 +0000 UTC m=+3.287631087" lastFinishedPulling="2026-05-06 17:10:29.228906095 +0000 UTC m=+20.792061167" observedRunningTime="2026-05-06 17:10:37.098191928 +0000 UTC m=+28.661347032" watchObservedRunningTime="2026-05-06 17:10:37.099672036 +0000 UTC m=+28.662827128"
May 06 17:10:37.552447 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:37.552415 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qht56"]
May 06 17:10:37.552581 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:37.552560 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:10:37.552684 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:37.552658 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f"
May 06 17:10:37.556931 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:37.556675 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lsdnk"]
May 06 17:10:37.556931 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:37.556791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk"
May 06 17:10:37.556931 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:37.556894 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lsdnk" podUID="04fcdaac-b196-4dea-a077-864b3ee42652"
May 06 17:10:37.557419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:37.557395 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mvgqp"]
May 06 17:10:37.557525 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:37.557507 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp"
May 06 17:10:37.557625 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:37.557602 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2"
May 06 17:10:38.947130 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:38.946950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk"
May 06 17:10:38.947573 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:38.947018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:10:38.947573 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:38.947047 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp"
May 06 17:10:38.947573 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:38.947317 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f"
May 06 17:10:38.947573 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:38.947191 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lsdnk" podUID="04fcdaac-b196-4dea-a077-864b3ee42652"
May 06 17:10:38.947573 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:38.947407 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2"
May 06 17:10:40.946374 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:40.946338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:10:40.946989 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:40.946338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk"
May 06 17:10:40.946989 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:40.946452 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qht56" podUID="0d02e3af-24fa-4677-818d-3668647af67f"
May 06 17:10:40.946989 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:40.946559 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-lsdnk" podUID="04fcdaac-b196-4dea-a077-864b3ee42652"
May 06 17:10:40.946989 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:40.946339 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp"
May 06 17:10:40.946989 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:40.946690 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mvgqp" podUID="2c28b880-a50d-4878-bf4e-20dc0f464cc2"
May 06 17:10:41.793465 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.793435 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-110.ec2.internal" event="NodeReady"
May 06 17:10:41.793615 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.793560 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
May 06 17:10:41.832085 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.832049 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-67944dd655-6wfbc"]
May 06 17:10:41.846114 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.846090 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xpknn"]
May 06 17:10:41.846282 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.846249 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67944dd655-6wfbc"
May 06 17:10:41.848316 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.848296 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
May 06 17:10:41.849681 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.848711 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
May 06 17:10:41.849681 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.848986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
May 06 17:10:41.849829 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.849683 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7cpjk\""
May 06 17:10:41.854594 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.854573 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
May 06 17:10:41.863445 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.863422 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-b6fc5dcc6-wptnn"]
May 06 17:10:41.863597 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.863579 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xpknn"
May 06 17:10:41.866886 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.866686 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
May 06 17:10:41.867150 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.867133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hfv2w\""
May 06 17:10:41.867834 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.867666 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
May 06 17:10:41.880590 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.880569 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx"]
May 06 17:10:41.880715 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.880700 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:10:41.886456 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.886438 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
May 06 17:10:41.889712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.886682 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-nz47m\""
May 06 17:10:41.889712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.886850 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
May 06 17:10:41.889712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.886981 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
May 06 17:10:41.889712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.887702 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
May 06 17:10:41.889712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.887751 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
May 06 17:10:41.889712 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.887803 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
May 06 17:10:41.893017 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.892865 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss"]
May 06 17:10:41.893149 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.893114 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx"
May 06 17:10:41.895435 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.895418 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
May 06 17:10:41.895770 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.895746 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
May 06 17:10:41.895851 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.895812 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-4j44w\""
May 06 17:10:41.895851 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.895825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
May 06 17:10:41.895949 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.895861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
May 06 17:10:41.906881 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.906546 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm"]
May 06 17:10:41.906881 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.906804 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss"
May 06 17:10:41.910388 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.910368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-p26xk\""
May 06 17:10:41.910788 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.910623 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
May 06 17:10:41.910930 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.910367 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
May 06 17:10:41.917542 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.917524 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7"]
May 06 17:10:41.917657 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.917643 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm"
May 06 17:10:41.920422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.920405 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
May 06 17:10:41.921103 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.920787 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
May 06 17:10:41.921103 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.921003 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-28xhs\""
May 06 17:10:41.921263 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.921147 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
May 06 17:10:41.928004 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.927984 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
May 06 17:10:41.934673 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.934657 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"]
May 06 17:10:41.934840 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.934825 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7"
May 06 17:10:41.936913 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.936898 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
May 06 17:10:41.937044 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.937029 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
May 06 17:10:41.937186 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.937173 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5rpn4\""
May 06 17:10:41.937305 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.937289 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
May 06 17:10:41.947437 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.947418 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-544c98cc96-44vjf"]
May 06 17:10:41.947865 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.947560 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"
May 06 17:10:41.950883 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.950865 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
May 06 17:10:41.951260 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.951212 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
May 06 17:10:41.951631 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.951616 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
May 06 17:10:41.952344 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.952329 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-z7qkg\""
May 06 17:10:41.952442 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.952428 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
May 06 17:10:41.962899 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.962882 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67944dd655-6wfbc"]
May 06 17:10:41.962965 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.962926 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss"]
May 06 17:10:41.962965 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.962938 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx"]
May 06 17:10:41.962965 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.962947 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xpknn"]
May 06 17:10:41.962965 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.962955 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm"]
May 06 17:10:41.962965 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.962962 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7"]
May 06 17:10:41.963194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.962970 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b6fc5dcc6-wptnn"]
May 06 17:10:41.963194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.962979 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k"]
May 06 17:10:41.963194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.963082 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-544c98cc96-44vjf"
May 06 17:10:41.966288 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.966273 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
May 06 17:10:41.967676 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.967660 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
May 06 17:10:41.967755 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.967680 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
May 06 17:10:41.967755 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.967662 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
May 06 17:10:41.967865 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.967778 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-h8phn\""
May 06 17:10:41.973744 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.973728 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
May 06 17:10:41.977850 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.977833 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8bzjj"]
May 06 17:10:41.977962 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.977950 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k"
May 06 17:10:41.981480 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.981460 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
May 06 17:10:41.981571 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.981461 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
May 06 17:10:41.981635 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.981588 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
May 06 17:10:41.981844 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.981833 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
May 06 17:10:41.981961 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.981948 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-f78kc\""
May 06 17:10:41.986686 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.986670 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf"]
May 06 17:10:41.986800 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.986787 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8bzjj"
May 06 17:10:41.988726 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.988712 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
May 06 17:10:41.988995 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.988980 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
May 06 17:10:41.989343 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.989325 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
May 06 17:10:41.989571 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.989392 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m5w6t\""
May 06 17:10:41.996057 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.996022 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr"]
May 06 17:10:41.996185 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.996162 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf"
May 06 17:10:41.999497 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.999476 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
May 06 17:10:41.999605 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.999589 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
May 06 17:10:41.999660 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.999608 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
May 06 17:10:41.999719 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:41.999703 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
May 06 17:10:42.004981 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.004965 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-697665887d-d4v6f"]
May 06 17:10:42.005127 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.005110 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr"
May 06 17:10:42.008116 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.008100 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
May 06 17:10:42.011804 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.011787 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5"]
May 06 17:10:42.011917 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.011904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f"
May 06 17:10:42.013906 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.013893 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zvxkm\""
May 06 17:10:42.013906 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.013907 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
May 06 17:10:42.014209 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.014198 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
May 06 17:10:42.016976 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.016961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e176dc4b-5512-4a3f-b240-34431b23770c-config\") pod \"service-ca-operator-686cb587d-d2qsx\" (UID: \"e176dc4b-5512-4a3f-b240-34431b23770c\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx"
May 06 17:10:42.017023 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.016989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4rl5\" (UniqueName: \"kubernetes.io/projected/79528661-fb31-4bfd-9a68-ed4dc761f1b2-kube-api-access-w4rl5\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7"
May 06 17:10:42.017023 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc"
May 06 17:10:42.017103 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-installation-pull-secrets\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc"
May 06 17:10:42.017103 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dca80132-2417-4fcf-b18a-e34cee059964-tmp-dir\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn"
May 06 17:10:42.017103 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4tv4\" (UniqueName: \"kubernetes.io/projected/e176dc4b-5512-4a3f-b240-34431b23770c-kube-api-access-l4tv4\") pod \"service-ca-operator-686cb587d-d2qsx\" (UID: \"e176dc4b-5512-4a3f-b240-34431b23770c\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx"
May 06 17:10:42.017103 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:10:42.017256 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dca80132-2417-4fcf-b18a-e34cee059964-config-volume\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn"
May 06 17:10:42.017256 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqb9k\" (UniqueName: \"kubernetes.io/projected/94a34ae8-6335-4d97-84ed-a5c0f421b59a-kube-api-access-lqb9k\") pod \"kube-storage-version-migrator-operator-649b864788-8z9xm\" (UID: \"94a34ae8-6335-4d97-84ed-a5c0f421b59a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm"
May 06 17:10:42.017256 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-trusted-ca\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc"
May 06 17:10:42.017256 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017188 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a34ae8-6335-4d97-84ed-a5c0f421b59a-config\") pod \"kube-storage-version-migrator-operator-649b864788-8z9xm\" (UID: \"94a34ae8-6335-4d97-84ed-a5c0f421b59a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm"
May 06 17:10:42.017256 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017212 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7"
May 06 17:10:42.017484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzg4v\" (UniqueName: \"kubernetes.io/projected/0baf8df0-cec6-4632-8692-64dcfb8359a0-kube-api-access-jzg4v\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:10:42.017484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn"
May 06 17:10:42.017484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-stats-auth\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:10:42.017484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-bound-sa-token\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc"
May 06 17:10:42.017484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww2sh\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-kube-api-access-ww2sh\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc"
May 06 17:10:42.017484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:10:42.017484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName:
\"kubernetes.io/empty-dir/b613c30a-43d7-453f-af58-b7ba639e475f-ca-trust-extracted\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.017747 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkg9r\" (UniqueName: \"kubernetes.io/projected/dca80132-2417-4fcf-b18a-e34cee059964-kube-api-access-gkg9r\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:42.017747 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a34ae8-6335-4d97-84ed-a5c0f421b59a-serving-cert\") pod \"kube-storage-version-migrator-operator-649b864788-8z9xm\" (UID: \"94a34ae8-6335-4d97-84ed-a5c0f421b59a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" May 06 17:10:42.017747 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e176dc4b-5512-4a3f-b240-34431b23770c-serving-cert\") pod \"service-ca-operator-686cb587d-d2qsx\" (UID: \"e176dc4b-5512-4a3f-b240-34431b23770c\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" May 06 17:10:42.017747 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-image-registry-private-configuration\") pod 
\"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.017747 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-default-certificate\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.017747 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-registry-certificates\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.017747 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.017696 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd8zn\" (UniqueName: \"kubernetes.io/projected/76b87b63-406b-4b3f-80ef-77d34e6a3f8f-kube-api-access-hd8zn\") pod \"volume-data-source-validator-6648d555c9-w78ss\" (UID: \"76b87b63-406b-4b3f-80ef-77d34e6a3f8f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss" May 06 17:10:42.019440 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.019424 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-77758f4558-6vwjj"] May 06 17:10:42.019566 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.019552 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5" May 06 17:10:42.024426 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.024397 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" May 06 17:10:42.024526 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.024434 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" May 06 17:10:42.024711 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.024694 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-9w57v\"" May 06 17:10:42.026638 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026620 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5"] May 06 17:10:42.026727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026647 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8bzjj"] May 06 17:10:42.026727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026661 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k"] May 06 17:10:42.026727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026675 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-697665887d-d4v6f"] May 06 17:10:42.026727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026687 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"] May 06 17:10:42.026727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026700 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf"] May 06 17:10:42.026727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026717 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-544c98cc96-44vjf"] May 06 17:10:42.026727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026728 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr"] May 06 17:10:42.027006 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026740 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-77758f4558-6vwjj"] May 06 17:10:42.027006 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.026746 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.029112 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.029095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" May 06 17:10:42.029167 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.029095 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-nxmzk\"" May 06 17:10:42.029335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.029313 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" May 06 17:10:42.029545 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.029419 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" May 06 17:10:42.029545 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.029435 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" May 06 17:10:42.035716 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.035701 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" May 06 17:10:42.118032 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ckd7\" (UniqueName: \"kubernetes.io/projected/847e4e37-03fb-4158-99da-8a78c4311404-kube-api-access-8ckd7\") pod \"managed-serviceaccount-addon-agent-65967864cb-psj8k\" (UID: \"847e4e37-03fb-4158-99da-8a78c4311404\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" May 06 17:10:42.118163 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj" May 06 17:10:42.118163 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhnf9\" (UniqueName: \"kubernetes.io/projected/a2139d10-b7a2-47a8-8697-edc7d34841c7-kube-api-access-jhnf9\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.118163 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e176dc4b-5512-4a3f-b240-34431b23770c-config\") pod \"service-ca-operator-686cb587d-d2qsx\" (UID: 
\"e176dc4b-5512-4a3f-b240-34431b23770c\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" May 06 17:10:42.118163 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r4j5\" (UniqueName: \"kubernetes.io/projected/0a0d2aea-3f9f-492b-82e1-39fb30f63a09-kube-api-access-8r4j5\") pod \"network-check-source-6859b67c86-lz2h5\" (UID: \"0a0d2aea-3f9f-492b-82e1-39fb30f63a09\") " pod="openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5" May 06 17:10:42.118163 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4lbl\" (UniqueName: \"kubernetes.io/projected/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-kube-api-access-v4lbl\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:42.118163 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.118473 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 
17:10:42.118473 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118275 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2139d10-b7a2-47a8-8697-edc7d34841c7-trusted-ca-bundle\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.118473 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2139d10-b7a2-47a8-8697-edc7d34841c7-service-ca-bundle\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.118473 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.118340 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 06 17:10:42.118473 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.118352 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67944dd655-6wfbc: secret "image-registry-tls" not found May 06 17:10:42.118473 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.118401 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls podName:b613c30a-43d7-453f-af58-b7ba639e475f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.618384837 +0000 UTC m=+34.181539906 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls") pod "image-registry-67944dd655-6wfbc" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f") : secret "image-registry-tls" not found May 06 17:10:42.118473 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-installation-pull-secrets\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.118473 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/faf20125-0ae8-4e08-a5c0-c453bf3b3647-klusterlet-config\") pod \"klusterlet-addon-workmgr-5865b48697-m64cr\" (UID: \"faf20125-0ae8-4e08-a5c0-c453bf3b3647\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.118473 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs\") pod 
\"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2139d10-b7a2-47a8-8697-edc7d34841c7-serving-cert\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dca80132-2417-4fcf-b18a-e34cee059964-config-volume\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e176dc4b-5512-4a3f-b240-34431b23770c-config\") pod \"service-ca-operator-686cb587d-d2qsx\" (UID: \"e176dc4b-5512-4a3f-b240-34431b23770c\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.118623 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqb9k\" (UniqueName: \"kubernetes.io/projected/94a34ae8-6335-4d97-84ed-a5c0f421b59a-kube-api-access-lqb9k\") pod \"kube-storage-version-migrator-operator-649b864788-8z9xm\" (UID: 
\"94a34ae8-6335-4d97-84ed-a5c0f421b59a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.118675 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.618660765 +0000 UTC m=+34.181815843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : secret "router-metrics-certs-default" not found May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzg4v\" (UniqueName: \"kubernetes.io/projected/0baf8df0-cec6-4632-8692-64dcfb8359a0-kube-api-access-jzg4v\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-stats-auth\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwtmj\" (UniqueName: \"kubernetes.io/projected/faf20125-0ae8-4e08-a5c0-c453bf3b3647-kube-api-access-jwtmj\") pod \"klusterlet-addon-workmgr-5865b48697-m64cr\" (UID: \"faf20125-0ae8-4e08-a5c0-c453bf3b3647\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.118877 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b613c30a-43d7-453f-af58-b7ba639e475f-ca-trust-extracted\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " 
pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.119533 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8552f8df-c056-4938-9998-2d4462c67e4b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.119533 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.118938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-default-certificate\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.119533 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.119035 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.619019035 +0000 UTC m=+34.182174127 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : configmap references non-existent config key: service-ca.crt May 06 17:10:42.119533 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.119067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-registry-certificates\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.119533 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.119090 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dca80132-2417-4fcf-b18a-e34cee059964-config-volume\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:42.119533 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.119099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-ca\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.119533 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.119204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkg9r\" (UniqueName: \"kubernetes.io/projected/dca80132-2417-4fcf-b18a-e34cee059964-kube-api-access-gkg9r\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 
17:10:42.119533 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.119272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e176dc4b-5512-4a3f-b240-34431b23770c-serving-cert\") pod \"service-ca-operator-686cb587d-d2qsx\" (UID: \"e176dc4b-5512-4a3f-b240-34431b23770c\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" May 06 17:10:42.119533 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.119297 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b613c30a-43d7-453f-af58-b7ba639e475f-ca-trust-extracted\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.119919 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.119301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/847e4e37-03fb-4158-99da-8a78c4311404-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65967864cb-psj8k\" (UID: \"847e4e37-03fb-4158-99da-8a78c4311404\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" May 06 17:10:42.120142 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.120109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-registry-certificates\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.120201 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.120184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbj8n\" 
(UniqueName: \"kubernetes.io/projected/8552f8df-c056-4938-9998-2d4462c67e4b-kube-api-access-nbj8n\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.120362 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.120341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a34ae8-6335-4d97-84ed-a5c0f421b59a-serving-cert\") pod \"kube-storage-version-migrator-operator-649b864788-8z9xm\" (UID: \"94a34ae8-6335-4d97-84ed-a5c0f421b59a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" May 06 17:10:42.120443 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.120407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-image-registry-private-configuration\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.120495 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.120445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a2139d10-b7a2-47a8-8697-edc7d34841c7-snapshots\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.120938 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.120730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-nginx-conf\") pod 
\"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" May 06 17:10:42.120938 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.120803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee523985-adee-4039-8a66-2e7b0a68522a-trusted-ca\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.120938 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.120862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hd8zn\" (UniqueName: \"kubernetes.io/projected/76b87b63-406b-4b3f-80ef-77d34e6a3f8f-kube-api-access-hd8zn\") pod \"volume-data-source-validator-6648d555c9-w78ss\" (UID: \"76b87b63-406b-4b3f-80ef-77d34e6a3f8f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss" May 06 17:10:42.120938 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.120894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqsq9\" (UniqueName: \"kubernetes.io/projected/ee523985-adee-4039-8a66-2e7b0a68522a-kube-api-access-nqsq9\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4rl5\" (UniqueName: \"kubernetes.io/projected/79528661-fb31-4bfd-9a68-ed4dc761f1b2-kube-api-access-w4rl5\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcd4h\" (UniqueName: \"kubernetes.io/projected/33943d98-8cd7-499f-b152-50856d1a3e54-kube-api-access-bcd4h\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-telemetry-config\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dca80132-2417-4fcf-b18a-e34cee059964-tmp-dir\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4tv4\" (UniqueName: \"kubernetes.io/projected/e176dc4b-5512-4a3f-b240-34431b23770c-kube-api-access-l4tv4\") pod \"service-ca-operator-686cb587d-d2qsx\" (UID: \"e176dc4b-5512-4a3f-b240-34431b23770c\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-hub\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a2139d10-b7a2-47a8-8697-edc7d34841c7-tmp\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee523985-adee-4039-8a66-2e7b0a68522a-serving-cert\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-trusted-ca\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee523985-adee-4039-8a66-2e7b0a68522a-config\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " 
pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a34ae8-6335-4d97-84ed-a5c0f421b59a-config\") pod \"kube-storage-version-migrator-operator-649b864788-8z9xm\" (UID: \"94a34ae8-6335-4d97-84ed-a5c0f421b59a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-bound-sa-token\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ww2sh\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-kube-api-access-ww2sh\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.122092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.121725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/faf20125-0ae8-4e08-a5c0-c453bf3b3647-tmp\") pod \"klusterlet-addon-workmgr-5865b48697-m64cr\" (UID: \"faf20125-0ae8-4e08-a5c0-c453bf3b3647\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.122860 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.122560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a34ae8-6335-4d97-84ed-a5c0f421b59a-config\") pod \"kube-storage-version-migrator-operator-649b864788-8z9xm\" (UID: \"94a34ae8-6335-4d97-84ed-a5c0f421b59a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" May 06 17:10:42.122860 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.122586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-installation-pull-secrets\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.122860 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.122597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-default-certificate\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: 
\"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.122860 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.122699 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e176dc4b-5512-4a3f-b240-34431b23770c-serving-cert\") pod \"service-ca-operator-686cb587d-d2qsx\" (UID: \"e176dc4b-5512-4a3f-b240-34431b23770c\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" May 06 17:10:42.123031 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.122928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-stats-auth\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.123876 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.123248 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 06 17:10:42.123876 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.123302 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls podName:dca80132-2417-4fcf-b18a-e34cee059964 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.623289021 +0000 UTC m=+34.186444104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls") pod "dns-default-xpknn" (UID: "dca80132-2417-4fcf-b18a-e34cee059964") : secret "dns-default-metrics-tls" not found May 06 17:10:42.123876 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.123324 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found May 06 17:10:42.123876 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.123378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dca80132-2417-4fcf-b18a-e34cee059964-tmp-dir\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:42.123876 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.123399 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls podName:79528661-fb31-4bfd-9a68-ed4dc761f1b2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.623381046 +0000 UTC m=+34.186536118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-46lq7" (UID: "79528661-fb31-4bfd-9a68-ed4dc761f1b2") : secret "samples-operator-tls" not found May 06 17:10:42.124182 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.124159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a34ae8-6335-4d97-84ed-a5c0f421b59a-serving-cert\") pod \"kube-storage-version-migrator-operator-649b864788-8z9xm\" (UID: \"94a34ae8-6335-4d97-84ed-a5c0f421b59a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" May 06 17:10:42.124593 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.124571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-trusted-ca\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.126863 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.126843 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-image-registry-private-configuration\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.129598 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.129580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqb9k\" (UniqueName: \"kubernetes.io/projected/94a34ae8-6335-4d97-84ed-a5c0f421b59a-kube-api-access-lqb9k\") pod 
\"kube-storage-version-migrator-operator-649b864788-8z9xm\" (UID: \"94a34ae8-6335-4d97-84ed-a5c0f421b59a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" May 06 17:10:42.130735 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.130714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkg9r\" (UniqueName: \"kubernetes.io/projected/dca80132-2417-4fcf-b18a-e34cee059964-kube-api-access-gkg9r\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:42.130925 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.130905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzg4v\" (UniqueName: \"kubernetes.io/projected/0baf8df0-cec6-4632-8692-64dcfb8359a0-kube-api-access-jzg4v\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.132124 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.132099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd8zn\" (UniqueName: \"kubernetes.io/projected/76b87b63-406b-4b3f-80ef-77d34e6a3f8f-kube-api-access-hd8zn\") pod \"volume-data-source-validator-6648d555c9-w78ss\" (UID: \"76b87b63-406b-4b3f-80ef-77d34e6a3f8f\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss" May 06 17:10:42.133076 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.133058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2sh\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-kube-api-access-ww2sh\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.133299 ip-10-0-135-110 
kubenswrapper[2576]: I0506 17:10:42.133283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4tv4\" (UniqueName: \"kubernetes.io/projected/e176dc4b-5512-4a3f-b240-34431b23770c-kube-api-access-l4tv4\") pod \"service-ca-operator-686cb587d-d2qsx\" (UID: \"e176dc4b-5512-4a3f-b240-34431b23770c\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" May 06 17:10:42.133429 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.133415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-bound-sa-token\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.134300 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.134284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4rl5\" (UniqueName: \"kubernetes.io/projected/79528661-fb31-4bfd-9a68-ed4dc761f1b2-kube-api-access-w4rl5\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" May 06 17:10:42.201872 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.201847 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" May 06 17:10:42.216046 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.216023 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss" May 06 17:10:42.222835 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.222812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2139d10-b7a2-47a8-8697-edc7d34841c7-trusted-ca-bundle\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.222923 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.222841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2139d10-b7a2-47a8-8697-edc7d34841c7-service-ca-bundle\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.222923 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.222862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/faf20125-0ae8-4e08-a5c0-c453bf3b3647-klusterlet-config\") pod \"klusterlet-addon-workmgr-5865b48697-m64cr\" (UID: \"faf20125-0ae8-4e08-a5c0-c453bf3b3647\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.222923 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.222880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" May 06 17:10:42.223082 ip-10-0-135-110 kubenswrapper[2576]: I0506 
17:10:42.223014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2139d10-b7a2-47a8-8697-edc7d34841c7-serving-cert\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.223082 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:42.223176 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.223084 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found May 06 17:10:42.223176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.223176 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.223146 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert podName:3bdb3b38-b44f-4385-b653-2e7de1f5dcbc nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.723123901 +0000 UTC m=+34.286278976 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-d4v6f" (UID: "3bdb3b38-b44f-4385-b653-2e7de1f5dcbc") : secret "networking-console-plugin-cert" not found May 06 17:10:42.223352 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223183 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwtmj\" (UniqueName: \"kubernetes.io/projected/faf20125-0ae8-4e08-a5c0-c453bf3b3647-kube-api-access-jwtmj\") pod \"klusterlet-addon-workmgr-5865b48697-m64cr\" (UID: \"faf20125-0ae8-4e08-a5c0-c453bf3b3647\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.223352 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223220 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8552f8df-c056-4938-9998-2d4462c67e4b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.223352 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-ca\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.223352 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/847e4e37-03fb-4158-99da-8a78c4311404-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-65967864cb-psj8k\" (UID: \"847e4e37-03fb-4158-99da-8a78c4311404\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" May 06 17:10:42.223352 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbj8n\" (UniqueName: \"kubernetes.io/projected/8552f8df-c056-4938-9998-2d4462c67e4b-kube-api-access-nbj8n\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.223584 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a2139d10-b7a2-47a8-8697-edc7d34841c7-snapshots\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.223584 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-nginx-conf\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" May 06 17:10:42.223584 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee523985-adee-4039-8a66-2e7b0a68522a-trusted-ca\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.223584 
ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqsq9\" (UniqueName: \"kubernetes.io/projected/ee523985-adee-4039-8a66-2e7b0a68522a-kube-api-access-nqsq9\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.223584 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bcd4h\" (UniqueName: \"kubernetes.io/projected/33943d98-8cd7-499f-b152-50856d1a3e54-kube-api-access-bcd4h\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj" May 06 17:10:42.223584 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-telemetry-config\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:42.223584 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2139d10-b7a2-47a8-8697-edc7d34841c7-service-ca-bundle\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.223895 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.223801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a2139d10-b7a2-47a8-8697-edc7d34841c7-trusted-ca-bundle\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.223895 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.223803 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found May 06 17:10:42.223895 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.223877 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls podName:3d080dd6-8304-4853-9ab3-bd27a2fdd22a nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.723858243 +0000 UTC m=+34.287013324 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-9kdcj" (UID: "3d080dd6-8304-4853-9ab3-bd27a2fdd22a") : secret "cluster-monitoring-operator-tls" not found May 06 17:10:42.224210 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.224190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-telemetry-config\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:42.224544 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.224516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-nginx-conf\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: 
\"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.224907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/8552f8df-c056-4938-9998-2d4462c67e4b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.224965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-hub\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.224991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a2139d10-b7a2-47a8-8697-edc7d34841c7-tmp\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee523985-adee-4039-8a66-2e7b0a68522a-serving-cert\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee523985-adee-4039-8a66-2e7b0a68522a-config\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/faf20125-0ae8-4e08-a5c0-c453bf3b3647-tmp\") pod \"klusterlet-addon-workmgr-5865b48697-m64cr\" (UID: \"faf20125-0ae8-4e08-a5c0-c453bf3b3647\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ckd7\" (UniqueName: \"kubernetes.io/projected/847e4e37-03fb-4158-99da-8a78c4311404-kube-api-access-8ckd7\") pod \"managed-serviceaccount-addon-agent-65967864cb-psj8k\" (UID: \"847e4e37-03fb-4158-99da-8a78c4311404\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee523985-adee-4039-8a66-2e7b0a68522a-trusted-ca\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj" 
May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhnf9\" (UniqueName: \"kubernetes.io/projected/a2139d10-b7a2-47a8-8697-edc7d34841c7-kube-api-access-jhnf9\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.225276 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r4j5\" (UniqueName: \"kubernetes.io/projected/0a0d2aea-3f9f-492b-82e1-39fb30f63a09-kube-api-access-8r4j5\") pod \"network-check-source-6859b67c86-lz2h5\" (UID: \"0a0d2aea-3f9f-492b-82e1-39fb30f63a09\") " pod="openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.225316 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert podName:33943d98-8cd7-499f-b152-50856d1a3e54 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:42.725303006 +0000 UTC m=+34.288458082 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert") pod "ingress-canary-8bzjj" (UID: "33943d98-8cd7-499f-b152-50856d1a3e54") : secret "canary-serving-cert" not found May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a2139d10-b7a2-47a8-8697-edc7d34841c7-snapshots\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4lbl\" (UniqueName: \"kubernetes.io/projected/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-kube-api-access-v4lbl\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:42.225680 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.225374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.226483 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.226023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/faf20125-0ae8-4e08-a5c0-c453bf3b3647-tmp\") pod \"klusterlet-addon-workmgr-5865b48697-m64cr\" (UID: \"faf20125-0ae8-4e08-a5c0-c453bf3b3647\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.226483 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.226117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee523985-adee-4039-8a66-2e7b0a68522a-config\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.226483 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.226186 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" May 06 17:10:42.226483 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.226314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a2139d10-b7a2-47a8-8697-edc7d34841c7-tmp\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.229391 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.229352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.230004 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.229984 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-hub\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.231002 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.230457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/847e4e37-03fb-4158-99da-8a78c4311404-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-65967864cb-psj8k\" (UID: \"847e4e37-03fb-4158-99da-8a78c4311404\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" May 06 17:10:42.231452 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.230622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/faf20125-0ae8-4e08-a5c0-c453bf3b3647-klusterlet-config\") pod \"klusterlet-addon-workmgr-5865b48697-m64cr\" (UID: \"faf20125-0ae8-4e08-a5c0-c453bf3b3647\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.231605 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.230772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2139d10-b7a2-47a8-8697-edc7d34841c7-serving-cert\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.231605 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.231402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee523985-adee-4039-8a66-2e7b0a68522a-serving-cert\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.234464 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.232966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ca\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-ca\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.234464 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.233407 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/8552f8df-c056-4938-9998-2d4462c67e4b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.234464 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.233450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqsq9\" (UniqueName: \"kubernetes.io/projected/ee523985-adee-4039-8a66-2e7b0a68522a-kube-api-access-nqsq9\") pod \"console-operator-77758f4558-6vwjj\" (UID: \"ee523985-adee-4039-8a66-2e7b0a68522a\") " pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.234464 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.233997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbj8n\" (UniqueName: \"kubernetes.io/projected/8552f8df-c056-4938-9998-2d4462c67e4b-kube-api-access-nbj8n\") pod \"cluster-proxy-proxy-agent-587f68fff5-zzqnf\" (UID: \"8552f8df-c056-4938-9998-2d4462c67e4b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.235358 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.235188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwtmj\" (UniqueName: \"kubernetes.io/projected/faf20125-0ae8-4e08-a5c0-c453bf3b3647-kube-api-access-jwtmj\") pod \"klusterlet-addon-workmgr-5865b48697-m64cr\" (UID: 
\"faf20125-0ae8-4e08-a5c0-c453bf3b3647\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.236042 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.235997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcd4h\" (UniqueName: \"kubernetes.io/projected/33943d98-8cd7-499f-b152-50856d1a3e54-kube-api-access-bcd4h\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj" May 06 17:10:42.236783 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.236725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhnf9\" (UniqueName: \"kubernetes.io/projected/a2139d10-b7a2-47a8-8697-edc7d34841c7-kube-api-access-jhnf9\") pod \"insights-operator-544c98cc96-44vjf\" (UID: \"a2139d10-b7a2-47a8-8697-edc7d34841c7\") " pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.237182 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.237155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ckd7\" (UniqueName: \"kubernetes.io/projected/847e4e37-03fb-4158-99da-8a78c4311404-kube-api-access-8ckd7\") pod \"managed-serviceaccount-addon-agent-65967864cb-psj8k\" (UID: \"847e4e37-03fb-4158-99da-8a78c4311404\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" May 06 17:10:42.238048 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.238027 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r4j5\" (UniqueName: \"kubernetes.io/projected/0a0d2aea-3f9f-492b-82e1-39fb30f63a09-kube-api-access-8r4j5\") pod \"network-check-source-6859b67c86-lz2h5\" (UID: \"0a0d2aea-3f9f-492b-82e1-39fb30f63a09\") " pod="openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5" May 06 17:10:42.238675 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.238650 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4lbl\" (UniqueName: \"kubernetes.io/projected/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-kube-api-access-v4lbl\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:42.271443 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.271411 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-544c98cc96-44vjf" May 06 17:10:42.294021 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.293264 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" May 06 17:10:42.309686 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.307937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:10:42.327260 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.326526 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:42.335400 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.334822 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5" May 06 17:10:42.348784 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.348632 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:10:42.457764 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.457266 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm"] May 06 17:10:42.470718 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.470673 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss"] May 06 17:10:42.486617 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.486558 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx"] May 06 17:10:42.499121 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.499095 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-544c98cc96-44vjf"] May 06 17:10:42.525025 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.524998 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf"] May 06 17:10:42.531484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.531456 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k"] May 06 17:10:42.540876 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.540855 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr"] May 06 17:10:42.544244 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:42.544194 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8552f8df_c056_4938_9998_2d4462c67e4b.slice/crio-e05531869037167ef35222c9de5dff3db3799ddac15356581dd8dad8652b5199 WatchSource:0}: 
Error finding container e05531869037167ef35222c9de5dff3db3799ddac15356581dd8dad8652b5199: Status 404 returned error can't find the container with id e05531869037167ef35222c9de5dff3db3799ddac15356581dd8dad8652b5199 May 06 17:10:42.544970 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:42.544939 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod847e4e37_03fb_4158_99da_8a78c4311404.slice/crio-728ecec9e905462579bd0795a884c4a27d2ffb6cbe9aea233275d8b7cc427fca WatchSource:0}: Error finding container 728ecec9e905462579bd0795a884c4a27d2ffb6cbe9aea233275d8b7cc427fca: Status 404 returned error can't find the container with id 728ecec9e905462579bd0795a884c4a27d2ffb6cbe9aea233275d8b7cc427fca May 06 17:10:42.560143 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:42.560018 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode176dc4b_5512_4a3f_b240_34431b23770c.slice/crio-388d92d65b8865eb4a6bc542cf0033b72064627c63e1b2fad500025d5b3ab6ef WatchSource:0}: Error finding container 388d92d65b8865eb4a6bc542cf0033b72064627c63e1b2fad500025d5b3ab6ef: Status 404 returned error can't find the container with id 388d92d65b8865eb4a6bc542cf0033b72064627c63e1b2fad500025d5b3ab6ef May 06 17:10:42.561152 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:42.561087 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a34ae8_6335_4d97_84ed_a5c0f421b59a.slice/crio-37e15186e4446bc265b6e4af4466882834c90829e69008983fff1142e07149fa WatchSource:0}: Error finding container 37e15186e4446bc265b6e4af4466882834c90829e69008983fff1142e07149fa: Status 404 returned error can't find the container with id 37e15186e4446bc265b6e4af4466882834c90829e69008983fff1142e07149fa May 06 17:10:42.561388 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:42.561364 2576 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2139d10_b7a2_47a8_8697_edc7d34841c7.slice/crio-6e9f20191a6285abaa8cf8585dbf8d077ac1223b6297f53209d19c36cb02839b WatchSource:0}: Error finding container 6e9f20191a6285abaa8cf8585dbf8d077ac1223b6297f53209d19c36cb02839b: Status 404 returned error can't find the container with id 6e9f20191a6285abaa8cf8585dbf8d077ac1223b6297f53209d19c36cb02839b May 06 17:10:42.629049 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.629027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:42.629208 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.629187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.629314 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629203 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 06 17:10:42.629314 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.629271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:42.629314 ip-10-0-135-110 kubenswrapper[2576]: E0506 
17:10:42.629274 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67944dd655-6wfbc: secret "image-registry-tls" not found May 06 17:10:42.629448 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629374 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found May 06 17:10:42.629448 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629375 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls podName:b613c30a-43d7-453f-af58-b7ba639e475f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:43.629356846 +0000 UTC m=+35.192511929 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls") pod "image-registry-67944dd655-6wfbc" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f") : secret "image-registry-tls" not found May 06 17:10:42.629448 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629437 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:43.629420683 +0000 UTC m=+35.192575757 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : secret "router-metrics-certs-default" not found May 06 17:10:42.629641 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.629469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:10:42.629641 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:43.629476258 +0000 UTC m=+35.192631359 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : configmap references non-existent config key: service-ca.crt May 06 17:10:42.629641 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.629572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" May 06 17:10:42.629641 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629575 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:42.629641 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629627 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs podName:2c28b880-a50d-4878-bf4e-20dc0f464cc2 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:14.629616128 +0000 UTC m=+66.192771200 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs") pod "network-metrics-daemon-mvgqp" (UID: "2c28b880-a50d-4878-bf4e-20dc0f464cc2") : object "openshift-multus"/"metrics-daemon-secret" not registered May 06 17:10:42.629817 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629645 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found May 06 17:10:42.629817 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629696 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls podName:79528661-fb31-4bfd-9a68-ed4dc761f1b2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:43.629682138 +0000 UTC m=+35.192837224 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-46lq7" (UID: "79528661-fb31-4bfd-9a68-ed4dc761f1b2") : secret "samples-operator-tls" not found May 06 17:10:42.629817 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.629728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:42.629907 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629854 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 06 17:10:42.629907 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.629891 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls podName:dca80132-2417-4fcf-b18a-e34cee059964 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:43.629878841 +0000 UTC m=+35.193033912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls") pod "dns-default-xpknn" (UID: "dca80132-2417-4fcf-b18a-e34cee059964") : secret "dns-default-metrics-tls" not found May 06 17:10:42.688773 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.688734 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5"] May 06 17:10:42.692891 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.692872 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-77758f4558-6vwjj"] May 06 17:10:42.695699 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:42.695662 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a0d2aea_3f9f_492b_82e1_39fb30f63a09.slice/crio-ca6d838694754ee27d84bccb7322c18dae3926c9902ff55c9665d7f99278cf1c WatchSource:0}: Error finding container ca6d838694754ee27d84bccb7322c18dae3926c9902ff55c9665d7f99278cf1c: Status 404 returned error can't find the container with id ca6d838694754ee27d84bccb7322c18dae3926c9902ff55c9665d7f99278cf1c May 06 17:10:42.696180 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:42.696155 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee523985_adee_4039_8a66_2e7b0a68522a.slice/crio-9eb8a5682d8e5329f887ad28339228264ef06d36bfadfe93a95506938e7dbaf3 WatchSource:0}: Error finding container 9eb8a5682d8e5329f887ad28339228264ef06d36bfadfe93a95506938e7dbaf3: Status 404 returned error can't find the container with id 
9eb8a5682d8e5329f887ad28339228264ef06d36bfadfe93a95506938e7dbaf3
May 06 17:10:42.730776 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.730747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f"
May 06 17:10:42.730890 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.730793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"
May 06 17:10:42.730890 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.730869 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
May 06 17:10:42.730890 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.730873 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
May 06 17:10:42.731001 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.730910 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls podName:3d080dd6-8304-4853-9ab3-bd27a2fdd22a nodeName:}" failed. No retries permitted until 2026-05-06 17:10:43.730897806 +0000 UTC m=+35.294052875 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-9kdcj" (UID: "3d080dd6-8304-4853-9ab3-bd27a2fdd22a") : secret "cluster-monitoring-operator-tls" not found
May 06 17:10:42.731001 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.730931 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert podName:3bdb3b38-b44f-4385-b653-2e7de1f5dcbc nodeName:}" failed. No retries permitted until 2026-05-06 17:10:43.73092487 +0000 UTC m=+35.294079939 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-d4v6f" (UID: "3bdb3b38-b44f-4385-b653-2e7de1f5dcbc") : secret "networking-console-plugin-cert" not found
May 06 17:10:42.731070 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.731043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj"
May 06 17:10:42.731141 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.731130 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
May 06 17:10:42.731179 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:42.731158 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert podName:33943d98-8cd7-499f-b152-50856d1a3e54 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:43.731150677 +0000 UTC m=+35.294305746 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert") pod "ingress-canary-8bzjj" (UID: "33943d98-8cd7-499f-b152-50856d1a3e54") : secret "canary-serving-cert" not found
May 06 17:10:42.832404 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.832323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrbj\" (UniqueName: \"kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj\") pod \"network-check-target-qht56\" (UID: \"0d02e3af-24fa-4677-818d-3668647af67f\") " pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:10:42.835914 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.835890 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbrbj\" (UniqueName: \"kubernetes.io/projected/0d02e3af-24fa-4677-818d-3668647af67f-kube-api-access-cbrbj\") pod \"network-check-target-qht56\" (UID: \"0d02e3af-24fa-4677-818d-3668647af67f\") " pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:10:42.945901 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.945872 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp"
May 06 17:10:42.946053 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.945910 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk"
May 06 17:10:42.946053 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.945872 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:10:42.948464 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.948435 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
May 06 17:10:42.948464 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.948457 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
May 06 17:10:42.948953 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.948458 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7hgjs\""
May 06 17:10:42.948953 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.948571 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c658w\""
May 06 17:10:42.963146 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:42.963022 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:10:43.034777 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.034741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk"
May 06 17:10:43.037781 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.037730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/04fcdaac-b196-4dea-a077-864b3ee42652-original-pull-secret\") pod \"global-pull-secret-syncer-lsdnk\" (UID: \"04fcdaac-b196-4dea-a077-864b3ee42652\") " pod="kube-system/global-pull-secret-syncer-lsdnk"
May 06 17:10:43.077484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.077454 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qht56"]
May 06 17:10:43.078929 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.078896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" event={"ID":"847e4e37-03fb-4158-99da-8a78c4311404","Type":"ContainerStarted","Data":"728ecec9e905462579bd0795a884c4a27d2ffb6cbe9aea233275d8b7cc427fca"}
May 06 17:10:43.079827 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.079804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" event={"ID":"94a34ae8-6335-4d97-84ed-a5c0f421b59a","Type":"ContainerStarted","Data":"37e15186e4446bc265b6e4af4466882834c90829e69008983fff1142e07149fa"}
May 06 17:10:43.080782 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.080755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss" event={"ID":"76b87b63-406b-4b3f-80ef-77d34e6a3f8f","Type":"ContainerStarted","Data":"4a40537f819993a6163e5aedecda9f6c843ccb279d89a2ca74b253eb565c8cd6"}
May 06 17:10:43.081465 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:10:43.081440 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d02e3af_24fa_4677_818d_3668647af67f.slice/crio-a46e3f8d6ee0358b9d626d2cb1de90060ef5299d26f43b2408b03f8608ec7829 WatchSource:0}: Error finding container a46e3f8d6ee0358b9d626d2cb1de90060ef5299d26f43b2408b03f8608ec7829: Status 404 returned error can't find the container with id a46e3f8d6ee0358b9d626d2cb1de90060ef5299d26f43b2408b03f8608ec7829
May 06 17:10:43.081891 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.081874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" event={"ID":"faf20125-0ae8-4e08-a5c0-c453bf3b3647","Type":"ContainerStarted","Data":"f59200fb03492bc3ebae5e655c1b4af240a9953ed500f03a7ee4004536fcbb9c"}
May 06 17:10:43.082837 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.082791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" event={"ID":"ee523985-adee-4039-8a66-2e7b0a68522a","Type":"ContainerStarted","Data":"9eb8a5682d8e5329f887ad28339228264ef06d36bfadfe93a95506938e7dbaf3"}
May 06 17:10:43.083775 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.083725 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-544c98cc96-44vjf" event={"ID":"a2139d10-b7a2-47a8-8697-edc7d34841c7","Type":"ContainerStarted","Data":"6e9f20191a6285abaa8cf8585dbf8d077ac1223b6297f53209d19c36cb02839b"}
May 06 17:10:43.084581 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.084555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" event={"ID":"e176dc4b-5512-4a3f-b240-34431b23770c","Type":"ContainerStarted","Data":"388d92d65b8865eb4a6bc542cf0033b72064627c63e1b2fad500025d5b3ab6ef"}
May 06 17:10:43.085408 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.085391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5" event={"ID":"0a0d2aea-3f9f-492b-82e1-39fb30f63a09","Type":"ContainerStarted","Data":"ca6d838694754ee27d84bccb7322c18dae3926c9902ff55c9665d7f99278cf1c"}
May 06 17:10:43.087760 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.087738 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4f7af8a-6313-4c92-9c2a-385f8580c399" containerID="073c5ba7bc8117be455db6aef9428bdb0017da964a2c598d7fd7bbb772983ea9" exitCode=0
May 06 17:10:43.087834 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.087807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" event={"ID":"e4f7af8a-6313-4c92-9c2a-385f8580c399","Type":"ContainerDied","Data":"073c5ba7bc8117be455db6aef9428bdb0017da964a2c598d7fd7bbb772983ea9"}
May 06 17:10:43.088731 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.088715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" event={"ID":"8552f8df-c056-4938-9998-2d4462c67e4b","Type":"ContainerStarted","Data":"e05531869037167ef35222c9de5dff3db3799ddac15356581dd8dad8652b5199"}
May 06 17:10:43.268859 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.268488 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-lsdnk"
May 06 17:10:43.467096 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.467044 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-lsdnk"]
May 06 17:10:43.643454 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.643362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:10:43.643454 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.643438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:10:43.643768 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.643524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7"
May 06 17:10:43.643768 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.643558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn"
May 06 17:10:43.643768 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.643624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc"
May 06 17:10:43.643768 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.643758 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
May 06 17:10:43.643968 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.643773 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67944dd655-6wfbc: secret "image-registry-tls" not found
May 06 17:10:43.643968 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.643832 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls podName:b613c30a-43d7-453f-af58-b7ba639e475f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:45.643811795 +0000 UTC m=+37.206966870 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls") pod "image-registry-67944dd655-6wfbc" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f") : secret "image-registry-tls" not found
May 06 17:10:43.644535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.644298 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
May 06 17:10:43.644535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.644334 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
May 06 17:10:43.644535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.644363 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls podName:79528661-fb31-4bfd-9a68-ed4dc761f1b2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:45.644345808 +0000 UTC m=+37.207500896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-46lq7" (UID: "79528661-fb31-4bfd-9a68-ed4dc761f1b2") : secret "samples-operator-tls" not found
May 06 17:10:43.644535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.644382 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls podName:dca80132-2417-4fcf-b18a-e34cee059964 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:45.644372993 +0000 UTC m=+37.207528069 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls") pod "dns-default-xpknn" (UID: "dca80132-2417-4fcf-b18a-e34cee059964") : secret "dns-default-metrics-tls" not found
May 06 17:10:43.644535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.644440 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:45.644424389 +0000 UTC m=+37.207579462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : configmap references non-existent config key: service-ca.crt
May 06 17:10:43.644535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.644460 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
May 06 17:10:43.644535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.644493 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:45.644483573 +0000 UTC m=+37.207638649 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : secret "router-metrics-certs-default" not found
May 06 17:10:43.746720 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.746685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj"
May 06 17:10:43.746922 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.746762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f"
May 06 17:10:43.746922 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:43.746808 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"
May 06 17:10:43.747031 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.746980 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
May 06 17:10:43.747085 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.747042 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls podName:3d080dd6-8304-4853-9ab3-bd27a2fdd22a nodeName:}" failed. No retries permitted until 2026-05-06 17:10:45.747024177 +0000 UTC m=+37.310179265 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-9kdcj" (UID: "3d080dd6-8304-4853-9ab3-bd27a2fdd22a") : secret "cluster-monitoring-operator-tls" not found
May 06 17:10:43.747216 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.747168 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
May 06 17:10:43.747216 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.747211 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert podName:3bdb3b38-b44f-4385-b653-2e7de1f5dcbc nodeName:}" failed. No retries permitted until 2026-05-06 17:10:45.747198698 +0000 UTC m=+37.310353771 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-d4v6f" (UID: "3bdb3b38-b44f-4385-b653-2e7de1f5dcbc") : secret "networking-console-plugin-cert" not found
May 06 17:10:43.747421 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.747276 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
May 06 17:10:43.747421 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:43.747305 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert podName:33943d98-8cd7-499f-b152-50856d1a3e54 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:45.747295257 +0000 UTC m=+37.310450331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert") pod "ingress-canary-8bzjj" (UID: "33943d98-8cd7-499f-b152-50856d1a3e54") : secret "canary-serving-cert" not found
May 06 17:10:44.116744 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:44.115839 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4f7af8a-6313-4c92-9c2a-385f8580c399" containerID="aeb2d6c7b5c40b44e9d2b347f404572e63e614d282c58dd34e496701392008c3" exitCode=0
May 06 17:10:44.116744 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:44.115920 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" event={"ID":"e4f7af8a-6313-4c92-9c2a-385f8580c399","Type":"ContainerDied","Data":"aeb2d6c7b5c40b44e9d2b347f404572e63e614d282c58dd34e496701392008c3"}
May 06 17:10:44.129859 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:44.129832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qht56"
event={"ID":"0d02e3af-24fa-4677-818d-3668647af67f","Type":"ContainerStarted","Data":"a46e3f8d6ee0358b9d626d2cb1de90060ef5299d26f43b2408b03f8608ec7829"}
May 06 17:10:44.131068 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:44.131018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lsdnk" event={"ID":"04fcdaac-b196-4dea-a077-864b3ee42652","Type":"ContainerStarted","Data":"6d4551f6ed03fec74e353338b82d42e632f980330c00cc8529f3bdd00fb9ffa0"}
May 06 17:10:45.157009 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:45.156222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" event={"ID":"e4f7af8a-6313-4c92-9c2a-385f8580c399","Type":"ContainerStarted","Data":"339a63cef44e6310192facf64b844175f8ea5aaa8235bc8b901d2d349847f4e4"}
May 06 17:10:45.672561 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:45.672500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7"
May 06 17:10:45.672561 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:45.672551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn"
May 06 17:10:45.672811 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:45.672615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc"
May 06 17:10:45.672811 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:45.672662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:10:45.672811 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:45.672712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:10:45.672965 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.672850 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:49.672832205 +0000 UTC m=+41.235987280 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : configmap references non-existent config key: service-ca.crt
May 06 17:10:45.673315 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.673290 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
May 06 17:10:45.673535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.673354 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls podName:79528661-fb31-4bfd-9a68-ed4dc761f1b2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:49.673338225 +0000 UTC m=+41.236493302 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-46lq7" (UID: "79528661-fb31-4bfd-9a68-ed4dc761f1b2") : secret "samples-operator-tls" not found
May 06 17:10:45.673535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.673415 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
May 06 17:10:45.673535 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.673444 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls podName:dca80132-2417-4fcf-b18a-e34cee059964 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:49.673433592 +0000 UTC m=+41.236588668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls") pod "dns-default-xpknn" (UID: "dca80132-2417-4fcf-b18a-e34cee059964") : secret "dns-default-metrics-tls" not found
May 06 17:10:45.673731 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.673610 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
May 06 17:10:45.673731 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.673623 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67944dd655-6wfbc: secret "image-registry-tls" not found
May 06 17:10:45.673731 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.673653 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls podName:b613c30a-43d7-453f-af58-b7ba639e475f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:49.673642748 +0000 UTC m=+41.236797822 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls") pod "image-registry-67944dd655-6wfbc" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f") : secret "image-registry-tls" not found
May 06 17:10:45.673731 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.673699 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
May 06 17:10:45.673939 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.673736 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed.
No retries permitted until 2026-05-06 17:10:49.673717077 +0000 UTC m=+41.236872155 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : secret "router-metrics-certs-default" not found
May 06 17:10:45.773528 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:45.773461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj"
May 06 17:10:45.773800 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:45.773548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f"
May 06 17:10:45.773800 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:45.773600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"
May 06 17:10:45.773911 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.773808 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
May 06 17:10:45.773911 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.773867 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls podName:3d080dd6-8304-4853-9ab3-bd27a2fdd22a nodeName:}" failed. No retries permitted until 2026-05-06 17:10:49.773849514 +0000 UTC m=+41.337004587 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-9kdcj" (UID: "3d080dd6-8304-4853-9ab3-bd27a2fdd22a") : secret "cluster-monitoring-operator-tls" not found
May 06 17:10:45.774260 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.774122 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
May 06 17:10:45.774260 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.774201 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert podName:3bdb3b38-b44f-4385-b653-2e7de1f5dcbc nodeName:}" failed. No retries permitted until 2026-05-06 17:10:49.774183183 +0000 UTC m=+41.337338257 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-d4v6f" (UID: "3bdb3b38-b44f-4385-b653-2e7de1f5dcbc") : secret "networking-console-plugin-cert" not found
May 06 17:10:45.774260 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.774214 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
May 06 17:10:45.774260 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:45.774291 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert podName:33943d98-8cd7-499f-b152-50856d1a3e54 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:49.774274772 +0000 UTC m=+41.337429849 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert") pod "ingress-canary-8bzjj" (UID: "33943d98-8cd7-499f-b152-50856d1a3e54") : secret "canary-serving-cert" not found
May 06 17:10:48.986169 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:48.985937 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7wfpq" podStartSLOduration=9.65704363 podStartE2EDuration="39.985922167s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:10:11.724873508 +0000 UTC m=+3.288028579" lastFinishedPulling="2026-05-06 17:10:42.053752034 +0000 UTC m=+33.616907116" observedRunningTime="2026-05-06 17:10:45.188034018 +0000 UTC m=+36.751189136" watchObservedRunningTime="2026-05-06 17:10:48.985922167 +0000 UTC m=+40.549077258"
May 06 17:10:49.716960 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:49.716922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\"
(UniqueName: \"kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" May 06 17:10:49.716960 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:49.716962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:49.717014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:49.717056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.717083 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.717120 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: 
E0506 17:10:49.717148 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls podName:79528661-fb31-4bfd-9a68-ed4dc761f1b2 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:57.717129711 +0000 UTC m=+49.280284793 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-46lq7" (UID: "79528661-fb31-4bfd-9a68-ed4dc761f1b2") : secret "samples-operator-tls" not found May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.717153 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.717167 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:57.717158632 +0000 UTC m=+49.280313706 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : secret "router-metrics-certs-default" not found May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.717155 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.717184 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67944dd655-6wfbc: secret "image-registry-tls" not found May 06 17:10:49.717212 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.717210 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls podName:dca80132-2417-4fcf-b18a-e34cee059964 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:57.717194677 +0000 UTC m=+49.280349747 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls") pod "dns-default-xpknn" (UID: "dca80132-2417-4fcf-b18a-e34cee059964") : secret "dns-default-metrics-tls" not found May 06 17:10:49.717734 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.717277 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls podName:b613c30a-43d7-453f-af58-b7ba639e475f nodeName:}" failed. No retries permitted until 2026-05-06 17:10:57.717218529 +0000 UTC m=+49.280373598 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls") pod "image-registry-67944dd655-6wfbc" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f") : secret "image-registry-tls" not found May 06 17:10:49.717734 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:49.717310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:49.717734 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.717409 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:57.717400444 +0000 UTC m=+49.280555514 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : configmap references non-existent config key: service-ca.crt May 06 17:10:49.818108 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:49.818072 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj" May 06 17:10:49.818309 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:49.818139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" May 06 17:10:49.818309 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:49.818183 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:49.818309 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.818219 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 06 17:10:49.818309 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.818291 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert podName:33943d98-8cd7-499f-b152-50856d1a3e54 nodeName:}" failed. No retries permitted until 2026-05-06 17:10:57.818272475 +0000 UTC m=+49.381427545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert") pod "ingress-canary-8bzjj" (UID: "33943d98-8cd7-499f-b152-50856d1a3e54") : secret "canary-serving-cert" not found May 06 17:10:49.818518 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.818319 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found May 06 17:10:49.818518 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.818342 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found May 06 17:10:49.818518 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.818395 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert podName:3bdb3b38-b44f-4385-b653-2e7de1f5dcbc nodeName:}" failed. No retries permitted until 2026-05-06 17:10:57.818376307 +0000 UTC m=+49.381531379 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-d4v6f" (UID: "3bdb3b38-b44f-4385-b653-2e7de1f5dcbc") : secret "networking-console-plugin-cert" not found May 06 17:10:49.818518 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:49.818415 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls podName:3d080dd6-8304-4853-9ab3-bd27a2fdd22a nodeName:}" failed. 
No retries permitted until 2026-05-06 17:10:57.818405317 +0000 UTC m=+49.381560386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-9kdcj" (UID: "3d080dd6-8304-4853-9ab3-bd27a2fdd22a") : secret "cluster-monitoring-operator-tls" not found May 06 17:10:57.788279 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:57.788224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:57.788317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788332 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788349 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-67944dd655-6wfbc: secret "image-registry-tls" not found May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:57.788395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788414 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls podName:b613c30a-43d7-453f-af58-b7ba639e475f nodeName:}" failed. No retries permitted until 2026-05-06 17:11:13.788395366 +0000 UTC m=+65.351550436 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls") pod "image-registry-67944dd655-6wfbc" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f") : secret "image-registry-tls" not found May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788451 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:13.78847346 +0000 UTC m=+65.351628531 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : configmap references non-existent config key: service-ca.crt May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788506 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:13.788501412 +0000 UTC m=+65.351656481 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : secret "router-metrics-certs-default" not found May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:57.788535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:57.788554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788633 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found May 06 17:10:57.788659 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788663 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls podName:dca80132-2417-4fcf-b18a-e34cee059964 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:13.788654604 +0000 UTC m=+65.351809673 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls") pod "dns-default-xpknn" (UID: "dca80132-2417-4fcf-b18a-e34cee059964") : secret "dns-default-metrics-tls" not found May 06 17:10:57.789019 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788672 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found May 06 17:10:57.789019 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.788730 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls podName:79528661-fb31-4bfd-9a68-ed4dc761f1b2 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:13.788715557 +0000 UTC m=+65.351870627 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-46lq7" (UID: "79528661-fb31-4bfd-9a68-ed4dc761f1b2") : secret "samples-operator-tls" not found May 06 17:10:57.889428 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:57.889399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj" May 06 17:10:57.889580 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:57.889448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" May 06 17:10:57.889580 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:57.889484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:10:57.889580 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.889516 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 06 17:10:57.889580 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.889565 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert podName:33943d98-8cd7-499f-b152-50856d1a3e54 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:13.889551388 +0000 UTC m=+65.452706458 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert") pod "ingress-canary-8bzjj" (UID: "33943d98-8cd7-499f-b152-50856d1a3e54") : secret "canary-serving-cert" not found May 06 17:10:57.889727 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.889585 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found May 06 17:10:57.889727 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.889591 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found May 06 17:10:57.889727 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.889617 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls podName:3d080dd6-8304-4853-9ab3-bd27a2fdd22a nodeName:}" failed. No retries permitted until 2026-05-06 17:11:13.889607655 +0000 UTC m=+65.452762725 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-9kdcj" (UID: "3d080dd6-8304-4853-9ab3-bd27a2fdd22a") : secret "cluster-monitoring-operator-tls" not found May 06 17:10:57.889727 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:10:57.889632 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert podName:3bdb3b38-b44f-4385-b653-2e7de1f5dcbc nodeName:}" failed. 
No retries permitted until 2026-05-06 17:11:13.889623786 +0000 UTC m=+65.452778855 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-d4v6f" (UID: "3bdb3b38-b44f-4385-b653-2e7de1f5dcbc") : secret "networking-console-plugin-cert" not found May 06 17:10:59.194624 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.194582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" event={"ID":"8552f8df-c056-4938-9998-2d4462c67e4b","Type":"ContainerStarted","Data":"a27bb5f70fc7449aad366291d7e8ea086d5785c52bbfc85a2a29768fcd3a4b1b"} May 06 17:10:59.195989 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.195962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" event={"ID":"847e4e37-03fb-4158-99da-8a78c4311404","Type":"ContainerStarted","Data":"05e7e8110ba5b429f052f430f0bcbafffb89a55768f7effa3dc0b5eb3dc9ad74"} May 06 17:10:59.197926 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.197898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" event={"ID":"94a34ae8-6335-4d97-84ed-a5c0f421b59a","Type":"ContainerStarted","Data":"f081c3bf4e94403fef75f633ec01ba294bbe639def9824c9df976288804d8727"} May 06 17:10:59.199280 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.199257 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss" event={"ID":"76b87b63-406b-4b3f-80ef-77d34e6a3f8f","Type":"ContainerStarted","Data":"3852171694a0003471521758e6fd4c6f698c7f9775773cc19f9aff8a4ac17ccd"} May 06 17:10:59.200771 
ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.200739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qht56" event={"ID":"0d02e3af-24fa-4677-818d-3668647af67f","Type":"ContainerStarted","Data":"1d03bfda2a3e3babbf5e28fc7d96b1032fc99315c3e050540444aebafc73d024"} May 06 17:10:59.200898 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.200881 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qht56" May 06 17:10:59.202134 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.202114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" event={"ID":"faf20125-0ae8-4e08-a5c0-c453bf3b3647","Type":"ContainerStarted","Data":"1e91328091dddb9c9fb7f803075537a10fc6514df0d879847db934f7e0f001f2"} May 06 17:10:59.202446 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.202431 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:59.203747 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.203729 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/0.log" May 06 17:10:59.203845 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.203765 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee523985-adee-4039-8a66-2e7b0a68522a" containerID="c1a305f91571f3f0c0236f507aa82a20852b3c47a7d82f3faafacbae78edeead" exitCode=255 May 06 17:10:59.203957 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.203935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" 
event={"ID":"ee523985-adee-4039-8a66-2e7b0a68522a","Type":"ContainerDied","Data":"c1a305f91571f3f0c0236f507aa82a20852b3c47a7d82f3faafacbae78edeead"} May 06 17:10:59.204043 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.204029 2576 scope.go:117] "RemoveContainer" containerID="c1a305f91571f3f0c0236f507aa82a20852b3c47a7d82f3faafacbae78edeead" May 06 17:10:59.204330 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.204311 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" May 06 17:10:59.205629 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.205605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-lsdnk" event={"ID":"04fcdaac-b196-4dea-a077-864b3ee42652","Type":"ContainerStarted","Data":"a212e04d8e16c846cabec7ad634c06a5935ba82f2a181afc9d108e21245717cf"} May 06 17:10:59.207085 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.206955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-544c98cc96-44vjf" event={"ID":"a2139d10-b7a2-47a8-8697-edc7d34841c7","Type":"ContainerStarted","Data":"56efad8ace09d81d0add09d6557c5c903b9d67294e39c7aa52f84f99d85c1031"} May 06 17:10:59.208417 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.208392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" event={"ID":"e176dc4b-5512-4a3f-b240-34431b23770c","Type":"ContainerStarted","Data":"43306d9ae5bd2aee2248e54655efd43411532a239f15bc02dc1488dfc5d60353"} May 06 17:10:59.210186 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.210167 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5" event={"ID":"0a0d2aea-3f9f-492b-82e1-39fb30f63a09","Type":"ContainerStarted","Data":"06799679a41b4811411ef132395654f428b0fa7a00fecb3b1f5ad93517faef41"} 
May 06 17:10:59.214559 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.214517 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-65967864cb-psj8k" podStartSLOduration=16.307182795 podStartE2EDuration="32.214503376s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:10:42.547353683 +0000 UTC m=+34.110508758" lastFinishedPulling="2026-05-06 17:10:58.454674265 +0000 UTC m=+50.017829339" observedRunningTime="2026-05-06 17:10:59.213322053 +0000 UTC m=+50.776477145" watchObservedRunningTime="2026-05-06 17:10:59.214503376 +0000 UTC m=+50.777658469" May 06 17:10:59.229300 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.229259 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-w78ss" podStartSLOduration=24.590348142 podStartE2EDuration="40.229220302s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="2026-05-06 17:10:42.557408701 +0000 UTC m=+34.120563777" lastFinishedPulling="2026-05-06 17:10:58.196280865 +0000 UTC m=+49.759435937" observedRunningTime="2026-05-06 17:10:59.228460019 +0000 UTC m=+50.791615111" watchObservedRunningTime="2026-05-06 17:10:59.229220302 +0000 UTC m=+50.792375395" May 06 17:10:59.247161 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.247123 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-544c98cc96-44vjf" podStartSLOduration=24.392660345 podStartE2EDuration="40.247110827s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="2026-05-06 17:10:42.564400687 +0000 UTC m=+34.127555771" lastFinishedPulling="2026-05-06 17:10:58.418851169 +0000 UTC m=+49.982006253" observedRunningTime="2026-05-06 17:10:59.24629883 +0000 UTC m=+50.809453923" watchObservedRunningTime="2026-05-06 17:10:59.247110827 
+0000 UTC m=+50.810265920" May 06 17:10:59.268956 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.268918 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5865b48697-m64cr" podStartSLOduration=16.372029333 podStartE2EDuration="32.268904269s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:10:42.559556834 +0000 UTC m=+34.122711907" lastFinishedPulling="2026-05-06 17:10:58.456431771 +0000 UTC m=+50.019586843" observedRunningTime="2026-05-06 17:10:59.26764549 +0000 UTC m=+50.830800583" watchObservedRunningTime="2026-05-06 17:10:59.268904269 +0000 UTC m=+50.832059361" May 06 17:10:59.291175 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.291132 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qht56" podStartSLOduration=34.919325792 podStartE2EDuration="50.291116482s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:10:43.083455729 +0000 UTC m=+34.646610802" lastFinishedPulling="2026-05-06 17:10:58.455246407 +0000 UTC m=+50.018401492" observedRunningTime="2026-05-06 17:10:59.289841494 +0000 UTC m=+50.852996586" watchObservedRunningTime="2026-05-06 17:10:59.291116482 +0000 UTC m=+50.854271578" May 06 17:10:59.342221 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.342170 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" podStartSLOduration=24.449090785 podStartE2EDuration="40.342152857s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="2026-05-06 17:10:42.563315836 +0000 UTC m=+34.126470910" lastFinishedPulling="2026-05-06 17:10:58.456377899 +0000 UTC m=+50.019532982" observedRunningTime="2026-05-06 17:10:59.316222402 +0000 UTC m=+50.879377494" 
watchObservedRunningTime="2026-05-06 17:10:59.342152857 +0000 UTC m=+50.905307949" May 06 17:10:59.371304 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.371259 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-lsdnk" podStartSLOduration=17.386662724 podStartE2EDuration="32.37121535s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:10:43.471521632 +0000 UTC m=+35.034676704" lastFinishedPulling="2026-05-06 17:10:58.456074253 +0000 UTC m=+50.019229330" observedRunningTime="2026-05-06 17:10:59.368954564 +0000 UTC m=+50.932109656" watchObservedRunningTime="2026-05-06 17:10:59.37121535 +0000 UTC m=+50.934370422" May 06 17:10:59.371570 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.371522 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" podStartSLOduration=24.478634545 podStartE2EDuration="40.371513162s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="2026-05-06 17:10:42.562132051 +0000 UTC m=+34.125287124" lastFinishedPulling="2026-05-06 17:10:58.455010671 +0000 UTC m=+50.018165741" observedRunningTime="2026-05-06 17:10:59.342425042 +0000 UTC m=+50.905580126" watchObservedRunningTime="2026-05-06 17:10:59.371513162 +0000 UTC m=+50.934668254" May 06 17:10:59.434422 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:10:59.433913 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-6859b67c86-lz2h5" podStartSLOduration=24.677269674 podStartE2EDuration="40.433896682s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="2026-05-06 17:10:42.697671589 +0000 UTC m=+34.260826661" lastFinishedPulling="2026-05-06 17:10:58.454298588 +0000 UTC m=+50.017453669" observedRunningTime="2026-05-06 17:10:59.433508729 +0000 UTC m=+50.996663821" 
watchObservedRunningTime="2026-05-06 17:10:59.433896682 +0000 UTC m=+50.997051776" May 06 17:11:00.216037 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:00.216005 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:11:00.216501 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:00.216473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/0.log" May 06 17:11:00.216567 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:00.216505 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee523985-adee-4039-8a66-2e7b0a68522a" containerID="062cafca8723cecffbb27ba60d236f2f8f4d034770da9ae772d2c7bd708cc595" exitCode=255 May 06 17:11:00.217196 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:00.217086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" event={"ID":"ee523985-adee-4039-8a66-2e7b0a68522a","Type":"ContainerDied","Data":"062cafca8723cecffbb27ba60d236f2f8f4d034770da9ae772d2c7bd708cc595"} May 06 17:11:00.217917 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:00.217320 2576 scope.go:117] "RemoveContainer" containerID="c1a305f91571f3f0c0236f507aa82a20852b3c47a7d82f3faafacbae78edeead" May 06 17:11:00.218854 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:00.218836 2576 scope.go:117] "RemoveContainer" containerID="062cafca8723cecffbb27ba60d236f2f8f4d034770da9ae772d2c7bd708cc595" May 06 17:11:00.219059 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:11:00.219035 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-77758f4558-6vwjj_openshift-console-operator(ee523985-adee-4039-8a66-2e7b0a68522a)\"" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" podUID="ee523985-adee-4039-8a66-2e7b0a68522a" May 06 17:11:01.220753 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:01.220681 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:11:01.221176 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:01.221075 2576 scope.go:117] "RemoveContainer" containerID="062cafca8723cecffbb27ba60d236f2f8f4d034770da9ae772d2c7bd708cc595" May 06 17:11:01.221330 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:11:01.221310 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-77758f4558-6vwjj_openshift-console-operator(ee523985-adee-4039-8a66-2e7b0a68522a)\"" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" podUID="ee523985-adee-4039-8a66-2e7b0a68522a" May 06 17:11:01.222736 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:01.222707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" event={"ID":"8552f8df-c056-4938-9998-2d4462c67e4b","Type":"ContainerStarted","Data":"e4b74935d957b7cf31e0fb35bf4ea2e7f3c023c3ca177b8c119bd1da89f00745"} May 06 17:11:01.222736 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:01.222746 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" event={"ID":"8552f8df-c056-4938-9998-2d4462c67e4b","Type":"ContainerStarted","Data":"914c11a3c67e26e354556947f125c75c6b95cdbc1d462033bdd70ca0e9e444e6"} May 06 17:11:01.264030 ip-10-0-135-110 kubenswrapper[2576]: I0506 
17:11:01.263982 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" podStartSLOduration=15.882609103 podStartE2EDuration="34.263965702s" podCreationTimestamp="2026-05-06 17:10:27 +0000 UTC" firstStartedPulling="2026-05-06 17:10:42.546437257 +0000 UTC m=+34.109592326" lastFinishedPulling="2026-05-06 17:11:00.927793852 +0000 UTC m=+52.490948925" observedRunningTime="2026-05-06 17:11:01.262035964 +0000 UTC m=+52.825191055" watchObservedRunningTime="2026-05-06 17:11:01.263965702 +0000 UTC m=+52.827120794" May 06 17:11:01.406524 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:01.406492 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6jd7q_babb97ac-5bf7-447e-9b34-f306f1a7d566/dns-node-resolver/0.log" May 06 17:11:02.006655 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:02.006631 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-85dtq_9ed404eb-a555-4ae7-b728-791f9d60c831/node-ca/0.log" May 06 17:11:02.350013 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:02.349985 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:11:02.350013 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:02.350016 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:11:02.350553 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:02.350422 2576 scope.go:117] "RemoveContainer" containerID="062cafca8723cecffbb27ba60d236f2f8f4d034770da9ae772d2c7bd708cc595" May 06 17:11:02.350638 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:11:02.350618 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=console-operator pod=console-operator-77758f4558-6vwjj_openshift-console-operator(ee523985-adee-4039-8a66-2e7b0a68522a)\"" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" podUID="ee523985-adee-4039-8a66-2e7b0a68522a" May 06 17:11:08.085348 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:08.085322 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bbxnx" May 06 17:11:13.827467 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.827425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:11:13.827467 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.827483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:11:13.827991 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.827656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:11:13.827991 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.827755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" May 06 17:11:13.827991 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.827777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:11:13.827991 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:11:13.827856 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle podName:0baf8df0-cec6-4632-8692-64dcfb8359a0 nodeName:}" failed. No retries permitted until 2026-05-06 17:11:45.827830479 +0000 UTC m=+97.390985569 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle") pod "router-default-b6fc5dcc6-wptnn" (UID: "0baf8df0-cec6-4632-8692-64dcfb8359a0") : configmap references non-existent config key: service-ca.crt May 06 17:11:13.830774 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.830747 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/dca80132-2417-4fcf-b18a-e34cee059964-metrics-tls\") pod \"dns-default-xpknn\" (UID: \"dca80132-2417-4fcf-b18a-e34cee059964\") " pod="openshift-dns/dns-default-xpknn" May 06 17:11:13.830896 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.830870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0baf8df0-cec6-4632-8692-64dcfb8359a0-metrics-certs\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" May 06 17:11:13.830896 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.830882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"image-registry-67944dd655-6wfbc\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:11:13.831008 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.830894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/79528661-fb31-4bfd-9a68-ed4dc761f1b2-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-46lq7\" (UID: \"79528661-fb31-4bfd-9a68-ed4dc761f1b2\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" May 06 17:11:13.929064 ip-10-0-135-110 
kubenswrapper[2576]: I0506 17:11:13.929026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj" May 06 17:11:13.929264 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.929081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" May 06 17:11:13.929264 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.929116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" May 06 17:11:13.929264 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:11:13.929252 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found May 06 17:11:13.929264 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:11:13.929262 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found May 06 17:11:13.929486 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:11:13.929338 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls podName:3d080dd6-8304-4853-9ab3-bd27a2fdd22a nodeName:}" failed. No retries permitted until 2026-05-06 17:11:45.929307984 +0000 UTC m=+97.492463055 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-9kdcj" (UID: "3d080dd6-8304-4853-9ab3-bd27a2fdd22a") : secret "cluster-monitoring-operator-tls" not found May 06 17:11:13.929486 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:11:13.929359 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert podName:3bdb3b38-b44f-4385-b653-2e7de1f5dcbc nodeName:}" failed. No retries permitted until 2026-05-06 17:11:45.929349484 +0000 UTC m=+97.492504561 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-d4v6f" (UID: "3bdb3b38-b44f-4385-b653-2e7de1f5dcbc") : secret "networking-console-plugin-cert" not found May 06 17:11:13.931563 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.931536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33943d98-8cd7-499f-b152-50856d1a3e54-cert\") pod \"ingress-canary-8bzjj\" (UID: \"33943d98-8cd7-499f-b152-50856d1a3e54\") " pod="openshift-ingress-canary/ingress-canary-8bzjj" May 06 17:11:13.946616 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.946593 2576 scope.go:117] "RemoveContainer" containerID="062cafca8723cecffbb27ba60d236f2f8f4d034770da9ae772d2c7bd708cc595" May 06 17:11:13.962561 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.962542 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7cpjk\"" May 06 17:11:13.970514 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.970493 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:11:13.975642 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.975624 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hfv2w\"" May 06 17:11:13.983675 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:13.983652 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xpknn" May 06 17:11:14.045996 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.045794 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5rpn4\"" May 06 17:11:14.055333 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.055307 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" May 06 17:11:14.102057 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.102000 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m5w6t\"" May 06 17:11:14.110964 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.110193 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8bzjj" May 06 17:11:14.132332 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.132154 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-67944dd655-6wfbc"] May 06 17:11:14.152699 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.152670 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xpknn"] May 06 17:11:14.156746 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:11:14.156262 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddca80132_2417_4fcf_b18a_e34cee059964.slice/crio-6745b8b9cb91d6a0a4363d0ab1858a97c16bcc1181afe093b13614b3c2d21b4e WatchSource:0}: Error finding container 6745b8b9cb91d6a0a4363d0ab1858a97c16bcc1181afe093b13614b3c2d21b4e: Status 404 returned error can't find the container with id 6745b8b9cb91d6a0a4363d0ab1858a97c16bcc1181afe093b13614b3c2d21b4e May 06 17:11:14.215384 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.215337 2576 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7"] May 06 17:11:14.260324 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.260295 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:11:14.260452 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.260386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" event={"ID":"ee523985-adee-4039-8a66-2e7b0a68522a","Type":"ContainerStarted","Data":"a3a7194b5b083a0d3ad31d41d4f02c03c6304f8f71361a12d03140aa63cc2136"} May 06 17:11:14.260692 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.260655 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:11:14.261508 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.261481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" event={"ID":"b613c30a-43d7-453f-af58-b7ba639e475f","Type":"ContainerStarted","Data":"ee06592e4d0ff02c355e8f2aec08fb03f633fd33e4222cf5b1665624152cdb77"} May 06 17:11:14.262553 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.262533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xpknn" event={"ID":"dca80132-2417-4fcf-b18a-e34cee059964","Type":"ContainerStarted","Data":"6745b8b9cb91d6a0a4363d0ab1858a97c16bcc1181afe093b13614b3c2d21b4e"} May 06 17:11:14.271434 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.271413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8bzjj"] May 06 17:11:14.273769 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:11:14.273748 2576 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33943d98_8cd7_499f_b152_50856d1a3e54.slice/crio-7f9bb92e2654202305ee1925715b134e6f2210183728e532ceefaffe58d32c77 WatchSource:0}: Error finding container 7f9bb92e2654202305ee1925715b134e6f2210183728e532ceefaffe58d32c77: Status 404 returned error can't find the container with id 7f9bb92e2654202305ee1925715b134e6f2210183728e532ceefaffe58d32c77 May 06 17:11:14.284666 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.284618 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" podStartSLOduration=39.527774396 podStartE2EDuration="55.28460389s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="2026-05-06 17:10:42.698047908 +0000 UTC m=+34.261202979" lastFinishedPulling="2026-05-06 17:10:58.454877388 +0000 UTC m=+50.018032473" observedRunningTime="2026-05-06 17:11:14.283351226 +0000 UTC m=+65.846506322" watchObservedRunningTime="2026-05-06 17:11:14.28460389 +0000 UTC m=+65.847758983" May 06 17:11:14.622006 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.621977 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-77758f4558-6vwjj" May 06 17:11:14.637605 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.637573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:11:14.640977 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:14.640936 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" May 06 17:11:14.648699 ip-10-0-135-110 kubenswrapper[2576]: E0506 
17:11:14.648673 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found May 06 17:11:14.648845 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:11:14.648752 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs podName:2c28b880-a50d-4878-bf4e-20dc0f464cc2 nodeName:}" failed. No retries permitted until 2026-05-06 17:12:18.648730812 +0000 UTC m=+130.211885898 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs") pod "network-metrics-daemon-mvgqp" (UID: "2c28b880-a50d-4878-bf4e-20dc0f464cc2") : secret "metrics-daemon-secret" not found May 06 17:11:15.268794 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:15.268752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8bzjj" event={"ID":"33943d98-8cd7-499f-b152-50856d1a3e54","Type":"ContainerStarted","Data":"7f9bb92e2654202305ee1925715b134e6f2210183728e532ceefaffe58d32c77"} May 06 17:11:15.270638 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:15.270601 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" event={"ID":"b613c30a-43d7-453f-af58-b7ba639e475f","Type":"ContainerStarted","Data":"bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6"} May 06 17:11:15.271013 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:15.270977 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:11:15.272425 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:15.272383 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" 
event={"ID":"79528661-fb31-4bfd-9a68-ed4dc761f1b2","Type":"ContainerStarted","Data":"f49a0af72030f6d62e4dd729c9ec8a0c21f3e998bcfe0f364ee88f304c428736"} May 06 17:11:15.294114 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:15.292964 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" podStartSLOduration=66.292946903 podStartE2EDuration="1m6.292946903s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:11:15.291545987 +0000 UTC m=+66.854701105" watchObservedRunningTime="2026-05-06 17:11:15.292946903 +0000 UTC m=+66.856101996" May 06 17:11:18.281185 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:18.281150 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" event={"ID":"79528661-fb31-4bfd-9a68-ed4dc761f1b2","Type":"ContainerStarted","Data":"56b4b1a923a4f72c8bf5c451b325d90cf35f5d4a85122f46bacd25592d420257"} May 06 17:11:18.281185 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:18.281188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" event={"ID":"79528661-fb31-4bfd-9a68-ed4dc761f1b2","Type":"ContainerStarted","Data":"dc2da0eaee937fdd2160aab806bb63d548948eed2d4443e8bd39cb71c06f2ac4"} May 06 17:11:18.282742 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:18.282721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xpknn" event={"ID":"dca80132-2417-4fcf-b18a-e34cee059964","Type":"ContainerStarted","Data":"908f903af037f4e9cbef9f80415f65400cae7a07fd550276c27e20065d8b54e4"} May 06 17:11:18.282742 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:18.282745 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-xpknn" event={"ID":"dca80132-2417-4fcf-b18a-e34cee059964","Type":"ContainerStarted","Data":"5b9c2571c96d2f832bf96f63dc10bfce531674f2a48cc0c53ef62b02df6b4108"} May 06 17:11:18.282888 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:18.282851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xpknn" May 06 17:11:18.284026 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:18.284005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8bzjj" event={"ID":"33943d98-8cd7-499f-b152-50856d1a3e54","Type":"ContainerStarted","Data":"b269d19cfcce6461e5aa5fddefcf42c17850bb7ef3842c2bc33e1c1ad795de88"} May 06 17:11:18.306487 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:18.306431 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-46lq7" podStartSLOduration=55.874685832 podStartE2EDuration="59.306415374s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="2026-05-06 17:11:14.283155537 +0000 UTC m=+65.846310617" lastFinishedPulling="2026-05-06 17:11:17.714885089 +0000 UTC m=+69.278040159" observedRunningTime="2026-05-06 17:11:18.306365736 +0000 UTC m=+69.869520831" watchObservedRunningTime="2026-05-06 17:11:18.306415374 +0000 UTC m=+69.869570464" May 06 17:11:18.335756 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:18.335708 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8bzjj" podStartSLOduration=33.90127838 podStartE2EDuration="37.335693648s" podCreationTimestamp="2026-05-06 17:10:41 +0000 UTC" firstStartedPulling="2026-05-06 17:11:14.275877846 +0000 UTC m=+65.839032935" lastFinishedPulling="2026-05-06 17:11:17.710293122 +0000 UTC m=+69.273448203" observedRunningTime="2026-05-06 17:11:18.334601036 +0000 UTC m=+69.897756140" 
watchObservedRunningTime="2026-05-06 17:11:18.335693648 +0000 UTC m=+69.898848789" May 06 17:11:18.352620 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:18.352570 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xpknn" podStartSLOduration=33.801209161 podStartE2EDuration="37.352554942s" podCreationTimestamp="2026-05-06 17:10:41 +0000 UTC" firstStartedPulling="2026-05-06 17:11:14.158944106 +0000 UTC m=+65.722099176" lastFinishedPulling="2026-05-06 17:11:17.710289887 +0000 UTC m=+69.273444957" observedRunningTime="2026-05-06 17:11:18.352111807 +0000 UTC m=+69.915266899" watchObservedRunningTime="2026-05-06 17:11:18.352554942 +0000 UTC m=+69.915710037" May 06 17:11:22.037132 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.037096 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-24dsh"] May 06 17:11:22.056499 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.056460 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-24dsh"] May 06 17:11:22.056669 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.056583 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-24dsh" May 06 17:11:22.058749 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.058725 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" May 06 17:11:22.058749 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.058731 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" May 06 17:11:22.059450 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.059430 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nrl57\"" May 06 17:11:22.204711 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.204608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/55c3824a-49d2-4a99-aec3-1cf22bb010f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh" May 06 17:11:22.204711 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.204641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/55c3824a-49d2-4a99-aec3-1cf22bb010f4-data-volume\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh" May 06 17:11:22.204711 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.204687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkt9t\" (UniqueName: \"kubernetes.io/projected/55c3824a-49d2-4a99-aec3-1cf22bb010f4-kube-api-access-qkt9t\") pod 
\"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.204919 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.204713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/55c3824a-49d2-4a99-aec3-1cf22bb010f4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.204919 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.204786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/55c3824a-49d2-4a99-aec3-1cf22bb010f4-crio-socket\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.305783 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.305707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/55c3824a-49d2-4a99-aec3-1cf22bb010f4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.305783 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.305784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/55c3824a-49d2-4a99-aec3-1cf22bb010f4-crio-socket\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.305971 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.305823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/55c3824a-49d2-4a99-aec3-1cf22bb010f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.305971 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.305842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/55c3824a-49d2-4a99-aec3-1cf22bb010f4-data-volume\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.305971 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.305917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/55c3824a-49d2-4a99-aec3-1cf22bb010f4-crio-socket\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.306091 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.305974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkt9t\" (UniqueName: \"kubernetes.io/projected/55c3824a-49d2-4a99-aec3-1cf22bb010f4-kube-api-access-qkt9t\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.306172 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.306158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/55c3824a-49d2-4a99-aec3-1cf22bb010f4-data-volume\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.306419 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.306401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/55c3824a-49d2-4a99-aec3-1cf22bb010f4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.308342 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.308322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/55c3824a-49d2-4a99-aec3-1cf22bb010f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.318655 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.318627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkt9t\" (UniqueName: \"kubernetes.io/projected/55c3824a-49d2-4a99-aec3-1cf22bb010f4-kube-api-access-qkt9t\") pod \"insights-runtime-extractor-24dsh\" (UID: \"55c3824a-49d2-4a99-aec3-1cf22bb010f4\") " pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.366130 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.366099 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-24dsh"
May 06 17:11:22.503143 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:22.503105 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-24dsh"]
May 06 17:11:22.506666 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:11:22.506628 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55c3824a_49d2_4a99_aec3_1cf22bb010f4.slice/crio-ef9dab7a196fc73017df9a115845c52f1f74f83b11d5f8adf7af9efd36b44510 WatchSource:0}: Error finding container ef9dab7a196fc73017df9a115845c52f1f74f83b11d5f8adf7af9efd36b44510: Status 404 returned error can't find the container with id ef9dab7a196fc73017df9a115845c52f1f74f83b11d5f8adf7af9efd36b44510
May 06 17:11:23.300484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:23.300446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-24dsh" event={"ID":"55c3824a-49d2-4a99-aec3-1cf22bb010f4","Type":"ContainerStarted","Data":"4bac58de1d1846c2024afe81f1d09c81ff507532d534a87a910683e976255260"}
May 06 17:11:23.300484 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:23.300486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-24dsh" event={"ID":"55c3824a-49d2-4a99-aec3-1cf22bb010f4","Type":"ContainerStarted","Data":"ef9dab7a196fc73017df9a115845c52f1f74f83b11d5f8adf7af9efd36b44510"}
May 06 17:11:24.305455 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:24.305420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-24dsh" event={"ID":"55c3824a-49d2-4a99-aec3-1cf22bb010f4","Type":"ContainerStarted","Data":"83673e6d0f422a93bbb8b0ccf40d9ac784d85b32be381753ea3af378a0ecfabd"}
May 06 17:11:26.312053 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:26.312018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-24dsh" event={"ID":"55c3824a-49d2-4a99-aec3-1cf22bb010f4","Type":"ContainerStarted","Data":"c97f0bdfcbfcadf5dcd9503c93ad758e75c9e6793196514aaa2d7a524c964aaa"}
May 06 17:11:26.331122 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:26.331071 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-24dsh" podStartSLOduration=1.448493022 podStartE2EDuration="4.331056059s" podCreationTimestamp="2026-05-06 17:11:22 +0000 UTC" firstStartedPulling="2026-05-06 17:11:22.651865531 +0000 UTC m=+74.215020601" lastFinishedPulling="2026-05-06 17:11:25.534428565 +0000 UTC m=+77.097583638" observedRunningTime="2026-05-06 17:11:26.329564085 +0000 UTC m=+77.892719178" watchObservedRunningTime="2026-05-06 17:11:26.331056059 +0000 UTC m=+77.894211195"
May 06 17:11:28.288967 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:28.288927 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xpknn"
May 06 17:11:30.219645 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:30.219612 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qht56"
May 06 17:11:33.975454 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:33.975417 2576 patch_prober.go:28] interesting pod/image-registry-67944dd655-6wfbc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
May 06 17:11:33.976114 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:33.975471 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" podUID="b613c30a-43d7-453f-af58-b7ba639e475f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
May 06 17:11:36.279092 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:36.279061 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-67944dd655-6wfbc"
May 06 17:11:44.412633 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:44.412600 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67944dd655-6wfbc"]
May 06 17:11:45.885621 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:45.885584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:11:45.886141 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:45.886123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0baf8df0-cec6-4632-8692-64dcfb8359a0-service-ca-bundle\") pod \"router-default-b6fc5dcc6-wptnn\" (UID: \"0baf8df0-cec6-4632-8692-64dcfb8359a0\") " pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:11:45.986546 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:45.986512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f"
May 06 17:11:45.986546 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:45.986551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"
May 06 17:11:45.988896 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:45.988863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d080dd6-8304-4853-9ab3-bd27a2fdd22a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-9kdcj\" (UID: \"3d080dd6-8304-4853-9ab3-bd27a2fdd22a\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"
May 06 17:11:45.989007 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:45.988905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3bdb3b38-b44f-4385-b653-2e7de1f5dcbc-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-d4v6f\" (UID: \"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc\") " pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f"
May 06 17:11:46.092579 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.092554 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-nz47m\""
May 06 17:11:46.101461 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.101443 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:11:46.157912 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.157882 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-z7qkg\""
May 06 17:11:46.166721 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.166696 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"
May 06 17:11:46.236584 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.236557 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-b6fc5dcc6-wptnn"]
May 06 17:11:46.239060 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.239040 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zvxkm\""
May 06 17:11:46.239605 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:11:46.239579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0baf8df0_cec6_4632_8692_64dcfb8359a0.slice/crio-2afd72a22e04ad96c8d01c8b6bc129e97a587b3cb261637cd3caba04274cc9f9 WatchSource:0}: Error finding container 2afd72a22e04ad96c8d01c8b6bc129e97a587b3cb261637cd3caba04274cc9f9: Status 404 returned error can't find the container with id 2afd72a22e04ad96c8d01c8b6bc129e97a587b3cb261637cd3caba04274cc9f9
May 06 17:11:46.247666 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.247641 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f"
May 06 17:11:46.301745 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.301718 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj"]
May 06 17:11:46.305893 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:11:46.305579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d080dd6_8304_4853_9ab3_bd27a2fdd22a.slice/crio-f643f343703ef316e859e279847e7247c60e65c9a64d6449b4decf2219515adf WatchSource:0}: Error finding container f643f343703ef316e859e279847e7247c60e65c9a64d6449b4decf2219515adf: Status 404 returned error can't find the container with id f643f343703ef316e859e279847e7247c60e65c9a64d6449b4decf2219515adf
May 06 17:11:46.368818 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.368772 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" event={"ID":"0baf8df0-cec6-4632-8692-64dcfb8359a0","Type":"ContainerStarted","Data":"3a543b5b885d83092f724e44794d670862a0e5780c59989e63d788d4fd4b9393"}
May 06 17:11:46.368818 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.368821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" event={"ID":"0baf8df0-cec6-4632-8692-64dcfb8359a0","Type":"ContainerStarted","Data":"2afd72a22e04ad96c8d01c8b6bc129e97a587b3cb261637cd3caba04274cc9f9"}
May 06 17:11:46.369933 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.369904 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" event={"ID":"3d080dd6-8304-4853-9ab3-bd27a2fdd22a","Type":"ContainerStarted","Data":"f643f343703ef316e859e279847e7247c60e65c9a64d6449b4decf2219515adf"}
May 06 17:11:46.389048 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.389009 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-b6fc5dcc6-wptnn" podStartSLOduration=87.388993645 podStartE2EDuration="1m27.388993645s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:11:46.388316477 +0000 UTC m=+97.951471568" watchObservedRunningTime="2026-05-06 17:11:46.388993645 +0000 UTC m=+97.952148729"
May 06 17:11:46.391795 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:46.391777 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-697665887d-d4v6f"]
May 06 17:11:46.393351 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:11:46.393326 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bdb3b38_b44f_4385_b653_2e7de1f5dcbc.slice/crio-887030cf332c969853526f6c24dc35032a5de954525a1ca77bec78d596e228e8 WatchSource:0}: Error finding container 887030cf332c969853526f6c24dc35032a5de954525a1ca77bec78d596e228e8: Status 404 returned error can't find the container with id 887030cf332c969853526f6c24dc35032a5de954525a1ca77bec78d596e228e8
May 06 17:11:47.102536 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:47.102461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:11:47.105576 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:47.105554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:11:47.373638 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:47.373558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" event={"ID":"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc","Type":"ContainerStarted","Data":"887030cf332c969853526f6c24dc35032a5de954525a1ca77bec78d596e228e8"}
May 06 17:11:47.373850 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:47.373817 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:11:47.375091 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:47.375067 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-b6fc5dcc6-wptnn"
May 06 17:11:49.381124 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:49.381079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" event={"ID":"3bdb3b38-b44f-4385-b653-2e7de1f5dcbc","Type":"ContainerStarted","Data":"c20dbe7e714d0356bb97a9dbc03eeda33a842b2aebc3dedd6ff74afa1ab58ff6"}
May 06 17:11:49.382455 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:49.382429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" event={"ID":"3d080dd6-8304-4853-9ab3-bd27a2fdd22a","Type":"ContainerStarted","Data":"2a24baaa78d86735d56b51347a1e4f2430027662d82cf326d5bdb4d75335db03"}
May 06 17:11:49.398581 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:49.398540 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-697665887d-d4v6f" podStartSLOduration=68.105146641 podStartE2EDuration="1m10.398530059s" podCreationTimestamp="2026-05-06 17:10:39 +0000 UTC" firstStartedPulling="2026-05-06 17:11:46.395063892 +0000 UTC m=+97.958218963" lastFinishedPulling="2026-05-06 17:11:48.688447306 +0000 UTC m=+100.251602381" observedRunningTime="2026-05-06 17:11:49.397431748 +0000 UTC m=+100.960586839" watchObservedRunningTime="2026-05-06 17:11:49.398530059 +0000 UTC m=+100.961685143"
May 06 17:11:49.415689 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:49.415633 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-9kdcj" podStartSLOduration=88.031753116 podStartE2EDuration="1m30.415618736s" podCreationTimestamp="2026-05-06 17:10:19 +0000 UTC" firstStartedPulling="2026-05-06 17:11:46.30818809 +0000 UTC m=+97.871343161" lastFinishedPulling="2026-05-06 17:11:48.692053695 +0000 UTC m=+100.255208781" observedRunningTime="2026-05-06 17:11:49.414599636 +0000 UTC m=+100.977754730" watchObservedRunningTime="2026-05-06 17:11:49.415618736 +0000 UTC m=+100.978774029"
May 06 17:11:58.727874 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.727833 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2xccj"]
May 06 17:11:58.732538 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.732517 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.737156 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.737131 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
May 06 17:11:58.737435 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.737418 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
May 06 17:11:58.737517 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.737480 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
May 06 17:11:58.737711 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.737688 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7k9wf\""
May 06 17:11:58.737823 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.737806 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
May 06 17:11:58.790547 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.790510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-sys\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.790547 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.790546 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-root\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.790727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.790638 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-metrics-client-ca\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.790727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.790668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-tls\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.790727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.790690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-accelerators-collector-config\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.790727 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.790710 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-wtmp\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.790852 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.790760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn64g\" (UniqueName: \"kubernetes.io/projected/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-kube-api-access-jn64g\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.790852 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.790808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.790852 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.790831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-textfile\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891382 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-metrics-client-ca\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891382 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-tls\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891622 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-accelerators-collector-config\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891622 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-wtmp\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891622 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jn64g\" (UniqueName: \"kubernetes.io/projected/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-kube-api-access-jn64g\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891622 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891622 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-textfile\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891622 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-sys\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891622 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-root\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891876 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-root\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891876 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-wtmp\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.891876 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.891684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-sys\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.892100 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.892073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-textfile\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.892220 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.892172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-accelerators-collector-config\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.892317 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.892303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-metrics-client-ca\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.894074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.894047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-tls\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.894074 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.894063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:58.900661 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:58.900641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn64g\" (UniqueName: \"kubernetes.io/projected/5ba6b606-5fa6-4eb9-a5e9-077c683fddec-kube-api-access-jn64g\") pod \"node-exporter-2xccj\" (UID: \"5ba6b606-5fa6-4eb9-a5e9-077c683fddec\") " pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:59.041894 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:59.041864 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2xccj"
May 06 17:11:59.050439 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:11:59.050400 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba6b606_5fa6_4eb9_a5e9_077c683fddec.slice/crio-641f9b34d7f6dda448db1737ef91c36fb7fd4b237e4e47eda78c4830d173956a WatchSource:0}: Error finding container 641f9b34d7f6dda448db1737ef91c36fb7fd4b237e4e47eda78c4830d173956a: Status 404 returned error can't find the container with id 641f9b34d7f6dda448db1737ef91c36fb7fd4b237e4e47eda78c4830d173956a
May 06 17:11:59.414920 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:11:59.414839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2xccj" event={"ID":"5ba6b606-5fa6-4eb9-a5e9-077c683fddec","Type":"ContainerStarted","Data":"641f9b34d7f6dda448db1737ef91c36fb7fd4b237e4e47eda78c4830d173956a"}
May 06 17:12:00.421551 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:00.421517 2576 generic.go:358] "Generic (PLEG): container finished" podID="5ba6b606-5fa6-4eb9-a5e9-077c683fddec" containerID="8eaee04bc81414f5b3a5a677a977008fd09b7f66284c7d242ce356b53503c3bb" exitCode=0
May 06 17:12:00.421949 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:00.421577 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2xccj" event={"ID":"5ba6b606-5fa6-4eb9-a5e9-077c683fddec","Type":"ContainerDied","Data":"8eaee04bc81414f5b3a5a677a977008fd09b7f66284c7d242ce356b53503c3bb"}
May 06 17:12:01.425979 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:01.425941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2xccj" event={"ID":"5ba6b606-5fa6-4eb9-a5e9-077c683fddec","Type":"ContainerStarted","Data":"a54de6df9a2678a2e54e9b425ceffe76138288eab5d3034340f4ecd16a4e9afe"}
May 06 17:12:01.425979 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:01.425985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2xccj" event={"ID":"5ba6b606-5fa6-4eb9-a5e9-077c683fddec","Type":"ContainerStarted","Data":"502da9928374deb42ad39a7c7c091b712d36943ba35499ff18e3fcca31be673b"}
May 06 17:12:01.465135 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:01.465081 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2xccj" podStartSLOduration=2.505684754 podStartE2EDuration="3.465065862s" podCreationTimestamp="2026-05-06 17:11:58 +0000 UTC" firstStartedPulling="2026-05-06 17:11:59.051862477 +0000 UTC m=+110.615017550" lastFinishedPulling="2026-05-06 17:12:00.011243589 +0000 UTC m=+111.574398658" observedRunningTime="2026-05-06 17:12:01.464043867 +0000 UTC m=+113.027198960" watchObservedRunningTime="2026-05-06 17:12:01.465065862 +0000 UTC m=+113.028220953"
May 06 17:12:09.431340 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.431279 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" podUID="b613c30a-43d7-453f-af58-b7ba639e475f" containerName="registry" containerID="cri-o://bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6" gracePeriod=30
May 06 17:12:09.682565 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.682511 2576 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:12:09.784673 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.784638 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b613c30a-43d7-453f-af58-b7ba639e475f-ca-trust-extracted\") pod \"b613c30a-43d7-453f-af58-b7ba639e475f\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " May 06 17:12:09.784851 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.784699 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-installation-pull-secrets\") pod \"b613c30a-43d7-453f-af58-b7ba639e475f\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " May 06 17:12:09.784851 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.784733 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") pod \"b613c30a-43d7-453f-af58-b7ba639e475f\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " May 06 17:12:09.784851 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.784752 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww2sh\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-kube-api-access-ww2sh\") pod \"b613c30a-43d7-453f-af58-b7ba639e475f\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " May 06 17:12:09.784851 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.784774 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-registry-certificates\") pod \"b613c30a-43d7-453f-af58-b7ba639e475f\" (UID: 
\"b613c30a-43d7-453f-af58-b7ba639e475f\") " May 06 17:12:09.784851 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.784795 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-trusted-ca\") pod \"b613c30a-43d7-453f-af58-b7ba639e475f\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " May 06 17:12:09.784851 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.784817 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-image-registry-private-configuration\") pod \"b613c30a-43d7-453f-af58-b7ba639e475f\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " May 06 17:12:09.785157 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.784878 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-bound-sa-token\") pod \"b613c30a-43d7-453f-af58-b7ba639e475f\" (UID: \"b613c30a-43d7-453f-af58-b7ba639e475f\") " May 06 17:12:09.785775 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.785713 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b613c30a-43d7-453f-af58-b7ba639e475f" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:12:09.785775 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.785725 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b613c30a-43d7-453f-af58-b7ba639e475f" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 06 17:12:09.787525 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.787470 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b613c30a-43d7-453f-af58-b7ba639e475f" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 06 17:12:09.787638 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.787592 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-kube-api-access-ww2sh" (OuterVolumeSpecName: "kube-api-access-ww2sh") pod "b613c30a-43d7-453f-af58-b7ba639e475f" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f"). InnerVolumeSpecName "kube-api-access-ww2sh". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:12:09.787738 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.787715 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b613c30a-43d7-453f-af58-b7ba639e475f" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:12:09.787781 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.787769 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b613c30a-43d7-453f-af58-b7ba639e475f" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:12:09.788151 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.788125 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b613c30a-43d7-453f-af58-b7ba639e475f" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 06 17:12:09.796953 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.796925 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b613c30a-43d7-453f-af58-b7ba639e475f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b613c30a-43d7-453f-af58-b7ba639e475f" (UID: "b613c30a-43d7-453f-af58-b7ba639e475f"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" May 06 17:12:09.886426 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.886392 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-registry-certificates\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath \"\"" May 06 17:12:09.886426 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.886420 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b613c30a-43d7-453f-af58-b7ba639e475f-trusted-ca\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath \"\"" May 06 17:12:09.886426 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.886430 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-image-registry-private-configuration\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath \"\"" May 06 17:12:09.886635 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.886440 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-bound-sa-token\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath \"\"" May 06 17:12:09.886635 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.886450 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b613c30a-43d7-453f-af58-b7ba639e475f-ca-trust-extracted\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath \"\"" May 06 17:12:09.886635 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.886458 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b613c30a-43d7-453f-af58-b7ba639e475f-installation-pull-secrets\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath 
\"\"" May 06 17:12:09.886635 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.886467 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-registry-tls\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath \"\"" May 06 17:12:09.886635 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:09.886475 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ww2sh\" (UniqueName: \"kubernetes.io/projected/b613c30a-43d7-453f-af58-b7ba639e475f-kube-api-access-ww2sh\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath \"\"" May 06 17:12:10.455516 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.455482 2576 generic.go:358] "Generic (PLEG): container finished" podID="b613c30a-43d7-453f-af58-b7ba639e475f" containerID="bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6" exitCode=0 May 06 17:12:10.455950 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.455530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" event={"ID":"b613c30a-43d7-453f-af58-b7ba639e475f","Type":"ContainerDied","Data":"bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6"} May 06 17:12:10.455950 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.455560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" event={"ID":"b613c30a-43d7-453f-af58-b7ba639e475f","Type":"ContainerDied","Data":"ee06592e4d0ff02c355e8f2aec08fb03f633fd33e4222cf5b1665624152cdb77"} May 06 17:12:10.455950 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.455569 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-67944dd655-6wfbc" May 06 17:12:10.455950 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.455578 2576 scope.go:117] "RemoveContainer" containerID="bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6" May 06 17:12:10.463820 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.463801 2576 scope.go:117] "RemoveContainer" containerID="bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6" May 06 17:12:10.464105 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:12:10.464084 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6\": container with ID starting with bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6 not found: ID does not exist" containerID="bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6" May 06 17:12:10.464152 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.464113 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6"} err="failed to get container status \"bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6\": rpc error: code = NotFound desc = could not find container \"bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6\": container with ID starting with bf2bcdb1fc4a98c2dd878af1efff4d957440abc7c1c2f0659256438a853c54c6 not found: ID does not exist" May 06 17:12:10.477429 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.477406 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-67944dd655-6wfbc"] May 06 17:12:10.482524 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.482502 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-67944dd655-6wfbc"] May 06 
17:12:10.950632 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:10.950601 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b613c30a-43d7-453f-af58-b7ba639e475f" path="/var/lib/kubelet/pods/b613c30a-43d7-453f-af58-b7ba639e475f/volumes" May 06 17:12:18.663221 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:18.663185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:12:18.665449 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:18.665428 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c28b880-a50d-4878-bf4e-20dc0f464cc2-metrics-certs\") pod \"network-metrics-daemon-mvgqp\" (UID: \"2c28b880-a50d-4878-bf4e-20dc0f464cc2\") " pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:12:18.960677 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:18.960590 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7hgjs\"" May 06 17:12:18.965965 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:18.965945 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mvgqp" May 06 17:12:19.086868 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:19.086846 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mvgqp"] May 06 17:12:19.089327 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:12:19.089301 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c28b880_a50d_4878_bf4e_20dc0f464cc2.slice/crio-6c6476f4b9d83e168472121715a1bace6ecc888dcee94de1d90a319f0b3385fa WatchSource:0}: Error finding container 6c6476f4b9d83e168472121715a1bace6ecc888dcee94de1d90a319f0b3385fa: Status 404 returned error can't find the container with id 6c6476f4b9d83e168472121715a1bace6ecc888dcee94de1d90a319f0b3385fa May 06 17:12:19.479969 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:19.479936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mvgqp" event={"ID":"2c28b880-a50d-4878-bf4e-20dc0f464cc2","Type":"ContainerStarted","Data":"6c6476f4b9d83e168472121715a1bace6ecc888dcee94de1d90a319f0b3385fa"} May 06 17:12:19.481180 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:19.481156 2576 generic.go:358] "Generic (PLEG): container finished" podID="a2139d10-b7a2-47a8-8697-edc7d34841c7" containerID="56efad8ace09d81d0add09d6557c5c903b9d67294e39c7aa52f84f99d85c1031" exitCode=0 May 06 17:12:19.481318 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:19.481246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-544c98cc96-44vjf" event={"ID":"a2139d10-b7a2-47a8-8697-edc7d34841c7","Type":"ContainerDied","Data":"56efad8ace09d81d0add09d6557c5c903b9d67294e39c7aa52f84f99d85c1031"} May 06 17:12:19.481587 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:19.481575 2576 scope.go:117] "RemoveContainer" containerID="56efad8ace09d81d0add09d6557c5c903b9d67294e39c7aa52f84f99d85c1031" May 06 
17:12:20.485814 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:20.485763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-544c98cc96-44vjf" event={"ID":"a2139d10-b7a2-47a8-8697-edc7d34841c7","Type":"ContainerStarted","Data":"6364a2d6f68798670a1066361a98423fcd666163a60c13613235626d17fec820"} May 06 17:12:20.487067 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:20.487040 2576 generic.go:358] "Generic (PLEG): container finished" podID="94a34ae8-6335-4d97-84ed-a5c0f421b59a" containerID="f081c3bf4e94403fef75f633ec01ba294bbe639def9824c9df976288804d8727" exitCode=0 May 06 17:12:20.487196 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:20.487088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" event={"ID":"94a34ae8-6335-4d97-84ed-a5c0f421b59a","Type":"ContainerDied","Data":"f081c3bf4e94403fef75f633ec01ba294bbe639def9824c9df976288804d8727"} May 06 17:12:20.487416 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:20.487401 2576 scope.go:117] "RemoveContainer" containerID="f081c3bf4e94403fef75f633ec01ba294bbe639def9824c9df976288804d8727" May 06 17:12:21.491465 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:21.491426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-8z9xm" event={"ID":"94a34ae8-6335-4d97-84ed-a5c0f421b59a","Type":"ContainerStarted","Data":"ae9c6953e731c79b4778845fadbe6095cee6fb3ac54f2d6923b9d684e2ccbfa8"} May 06 17:12:21.492945 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:21.492921 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mvgqp" event={"ID":"2c28b880-a50d-4878-bf4e-20dc0f464cc2","Type":"ContainerStarted","Data":"f8a5823f8fce62f637299255c74812ccdd33250e9febb0a2ce5f9736b4b311c8"} May 06 17:12:21.492945 ip-10-0-135-110 
kubenswrapper[2576]: I0506 17:12:21.492949 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mvgqp" event={"ID":"2c28b880-a50d-4878-bf4e-20dc0f464cc2","Type":"ContainerStarted","Data":"13b3a59f0150aab37cfc1d27210503412c147ce529ccf380093a1fc958fb7c1f"} May 06 17:12:21.525666 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:21.525626 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mvgqp" podStartSLOduration=131.123309913 podStartE2EDuration="2m12.525613654s" podCreationTimestamp="2026-05-06 17:10:09 +0000 UTC" firstStartedPulling="2026-05-06 17:12:19.091271663 +0000 UTC m=+130.654426732" lastFinishedPulling="2026-05-06 17:12:20.493575398 +0000 UTC m=+132.056730473" observedRunningTime="2026-05-06 17:12:21.525195012 +0000 UTC m=+133.088350108" watchObservedRunningTime="2026-05-06 17:12:21.525613654 +0000 UTC m=+133.088768746" May 06 17:12:29.515902 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:29.515815 2576 generic.go:358] "Generic (PLEG): container finished" podID="e176dc4b-5512-4a3f-b240-34431b23770c" containerID="43306d9ae5bd2aee2248e54655efd43411532a239f15bc02dc1488dfc5d60353" exitCode=0 May 06 17:12:29.516248 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:29.515894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" event={"ID":"e176dc4b-5512-4a3f-b240-34431b23770c","Type":"ContainerDied","Data":"43306d9ae5bd2aee2248e54655efd43411532a239f15bc02dc1488dfc5d60353"} May 06 17:12:29.516248 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:29.516175 2576 scope.go:117] "RemoveContainer" containerID="43306d9ae5bd2aee2248e54655efd43411532a239f15bc02dc1488dfc5d60353" May 06 17:12:30.520362 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:30.520328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-d2qsx" 
event={"ID":"e176dc4b-5512-4a3f-b240-34431b23770c","Type":"ContainerStarted","Data":"7febfa096ddc8649d8627f1f5626e9a3287c87c54d8cc557a4cdc96ccc238949"} May 06 17:12:32.309786 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:32.309748 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" podUID="8552f8df-c056-4938-9998-2d4462c67e4b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" May 06 17:12:42.310091 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:42.310050 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" podUID="8552f8df-c056-4938-9998-2d4462c67e4b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" May 06 17:12:52.310250 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:52.310136 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" podUID="8552f8df-c056-4938-9998-2d4462c67e4b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" May 06 17:12:52.310250 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:52.310202 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" May 06 17:12:52.310734 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:52.310675 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"e4b74935d957b7cf31e0fb35bf4ea2e7f3c023c3ca177b8c119bd1da89f00745"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" containerMessage="Container service-proxy failed liveness probe, will be restarted" May 06 17:12:52.310734 ip-10-0-135-110 
kubenswrapper[2576]: I0506 17:12:52.310711 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" podUID="8552f8df-c056-4938-9998-2d4462c67e4b" containerName="service-proxy" containerID="cri-o://e4b74935d957b7cf31e0fb35bf4ea2e7f3c023c3ca177b8c119bd1da89f00745" gracePeriod=30 May 06 17:12:52.580314 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:52.580211 2576 generic.go:358] "Generic (PLEG): container finished" podID="8552f8df-c056-4938-9998-2d4462c67e4b" containerID="e4b74935d957b7cf31e0fb35bf4ea2e7f3c023c3ca177b8c119bd1da89f00745" exitCode=2 May 06 17:12:52.580314 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:52.580283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" event={"ID":"8552f8df-c056-4938-9998-2d4462c67e4b","Type":"ContainerDied","Data":"e4b74935d957b7cf31e0fb35bf4ea2e7f3c023c3ca177b8c119bd1da89f00745"} May 06 17:12:52.580478 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:12:52.580323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-587f68fff5-zzqnf" event={"ID":"8552f8df-c056-4938-9998-2d4462c67e4b","Type":"ContainerStarted","Data":"ffae6c905b139c6e2d87de5277097655075f1f066ca3fcdfaf029c22c7810da0"} May 06 17:15:08.869172 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:08.869141 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:15:08.870316 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:08.870289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:15:08.877992 ip-10-0-135-110 kubenswrapper[2576]: 
I0506 17:15:08.877969 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:15:08.878818 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:08.878797 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:15:08.881132 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:08.881113 2576 kubelet.go:1628] "Image garbage collection succeeded" May 06 17:15:35.438290 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.438252 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf"] May 06 17:15:35.443251 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.442452 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b613c30a-43d7-453f-af58-b7ba639e475f" containerName="registry" May 06 17:15:35.443251 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.442474 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b613c30a-43d7-453f-af58-b7ba639e475f" containerName="registry" May 06 17:15:35.443251 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.442530 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b613c30a-43d7-453f-af58-b7ba639e475f" containerName="registry" May 06 17:15:35.445480 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.445458 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.447909 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.447891 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-spdkz\"" May 06 17:15:35.447909 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.447904 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" May 06 17:15:35.448087 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.447987 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" May 06 17:15:35.448219 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.448194 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" May 06 17:15:35.448219 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.448213 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" May 06 17:15:35.450862 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.450845 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf"] May 06 17:15:35.523796 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.523758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q2q4\" (UniqueName: \"kubernetes.io/projected/f2231f2b-56f6-42f4-a0b6-796f31e660cc-kube-api-access-7q2q4\") pod \"opendatahub-operator-controller-manager-698574c4f-psxkf\" (UID: \"f2231f2b-56f6-42f4-a0b6-796f31e660cc\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.523976 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.523827 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2231f2b-56f6-42f4-a0b6-796f31e660cc-webhook-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-psxkf\" (UID: \"f2231f2b-56f6-42f4-a0b6-796f31e660cc\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.523976 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.523852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2231f2b-56f6-42f4-a0b6-796f31e660cc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-psxkf\" (UID: \"f2231f2b-56f6-42f4-a0b6-796f31e660cc\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.625246 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.625195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2231f2b-56f6-42f4-a0b6-796f31e660cc-webhook-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-psxkf\" (UID: \"f2231f2b-56f6-42f4-a0b6-796f31e660cc\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.625411 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.625269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2231f2b-56f6-42f4-a0b6-796f31e660cc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-psxkf\" (UID: \"f2231f2b-56f6-42f4-a0b6-796f31e660cc\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.625411 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.625302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q2q4\" (UniqueName: 
\"kubernetes.io/projected/f2231f2b-56f6-42f4-a0b6-796f31e660cc-kube-api-access-7q2q4\") pod \"opendatahub-operator-controller-manager-698574c4f-psxkf\" (UID: \"f2231f2b-56f6-42f4-a0b6-796f31e660cc\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.627690 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.627663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2231f2b-56f6-42f4-a0b6-796f31e660cc-apiservice-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-psxkf\" (UID: \"f2231f2b-56f6-42f4-a0b6-796f31e660cc\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.627690 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.627684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2231f2b-56f6-42f4-a0b6-796f31e660cc-webhook-cert\") pod \"opendatahub-operator-controller-manager-698574c4f-psxkf\" (UID: \"f2231f2b-56f6-42f4-a0b6-796f31e660cc\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.650840 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.650800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q2q4\" (UniqueName: \"kubernetes.io/projected/f2231f2b-56f6-42f4-a0b6-796f31e660cc-kube-api-access-7q2q4\") pod \"opendatahub-operator-controller-manager-698574c4f-psxkf\" (UID: \"f2231f2b-56f6-42f4-a0b6-796f31e660cc\") " pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.756689 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.756606 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:35.902637 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.902602 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf"] May 06 17:15:35.905494 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:15:35.905465 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2231f2b_56f6_42f4_a0b6_796f31e660cc.slice/crio-600780c75f207e59c5ac8a25568a0776acccaf35163297cfebe21a2049137242 WatchSource:0}: Error finding container 600780c75f207e59c5ac8a25568a0776acccaf35163297cfebe21a2049137242: Status 404 returned error can't find the container with id 600780c75f207e59c5ac8a25568a0776acccaf35163297cfebe21a2049137242 May 06 17:15:35.907143 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:35.907127 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 06 17:15:36.038475 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:36.038444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" event={"ID":"f2231f2b-56f6-42f4-a0b6-796f31e660cc","Type":"ContainerStarted","Data":"600780c75f207e59c5ac8a25568a0776acccaf35163297cfebe21a2049137242"} May 06 17:15:39.049819 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:39.049785 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" event={"ID":"f2231f2b-56f6-42f4-a0b6-796f31e660cc","Type":"ContainerStarted","Data":"fe195dacceab83314b2a71ce6896a443639ddaa42b4904e90a40d98bec48237d"} May 06 17:15:39.050184 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:39.049930 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:39.071274 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:39.071216 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" podStartSLOduration=1.361421837 podStartE2EDuration="4.07120208s" podCreationTimestamp="2026-05-06 17:15:35 +0000 UTC" firstStartedPulling="2026-05-06 17:15:35.907285674 +0000 UTC m=+327.470440745" lastFinishedPulling="2026-05-06 17:15:38.617065904 +0000 UTC m=+330.180220988" observedRunningTime="2026-05-06 17:15:39.070561162 +0000 UTC m=+330.633716265" watchObservedRunningTime="2026-05-06 17:15:39.07120208 +0000 UTC m=+330.634357171" May 06 17:15:50.055860 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:50.055834 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-698574c4f-psxkf" May 06 17:15:56.313673 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.313593 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp"] May 06 17:15:56.316731 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.316712 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.318936 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.318913 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" May 06 17:15:56.318936 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.318932 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" May 06 17:15:56.319090 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.318969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-6grhb\"" May 06 17:15:56.323939 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.323918 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp"] May 06 17:15:56.384836 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.384808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc0e3cf-aae0-4671-be00-8d0656f50a8c-tls-certs\") pod \"kube-auth-proxy-69f8cf9d8c-87zfp\" (UID: \"fcc0e3cf-aae0-4671-be00-8d0656f50a8c\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.384990 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.384850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcc0e3cf-aae0-4671-be00-8d0656f50a8c-tmp\") pod \"kube-auth-proxy-69f8cf9d8c-87zfp\" (UID: \"fcc0e3cf-aae0-4671-be00-8d0656f50a8c\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.384990 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.384872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkftj\" (UniqueName: 
\"kubernetes.io/projected/fcc0e3cf-aae0-4671-be00-8d0656f50a8c-kube-api-access-vkftj\") pod \"kube-auth-proxy-69f8cf9d8c-87zfp\" (UID: \"fcc0e3cf-aae0-4671-be00-8d0656f50a8c\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.486135 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.486102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc0e3cf-aae0-4671-be00-8d0656f50a8c-tls-certs\") pod \"kube-auth-proxy-69f8cf9d8c-87zfp\" (UID: \"fcc0e3cf-aae0-4671-be00-8d0656f50a8c\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.486301 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.486152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcc0e3cf-aae0-4671-be00-8d0656f50a8c-tmp\") pod \"kube-auth-proxy-69f8cf9d8c-87zfp\" (UID: \"fcc0e3cf-aae0-4671-be00-8d0656f50a8c\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.486301 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.486269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkftj\" (UniqueName: \"kubernetes.io/projected/fcc0e3cf-aae0-4671-be00-8d0656f50a8c-kube-api-access-vkftj\") pod \"kube-auth-proxy-69f8cf9d8c-87zfp\" (UID: \"fcc0e3cf-aae0-4671-be00-8d0656f50a8c\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.488409 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.488383 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fcc0e3cf-aae0-4671-be00-8d0656f50a8c-tmp\") pod \"kube-auth-proxy-69f8cf9d8c-87zfp\" (UID: \"fcc0e3cf-aae0-4671-be00-8d0656f50a8c\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.488527 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.488513 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc0e3cf-aae0-4671-be00-8d0656f50a8c-tls-certs\") pod \"kube-auth-proxy-69f8cf9d8c-87zfp\" (UID: \"fcc0e3cf-aae0-4671-be00-8d0656f50a8c\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.494941 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.494920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkftj\" (UniqueName: \"kubernetes.io/projected/fcc0e3cf-aae0-4671-be00-8d0656f50a8c-kube-api-access-vkftj\") pod \"kube-auth-proxy-69f8cf9d8c-87zfp\" (UID: \"fcc0e3cf-aae0-4671-be00-8d0656f50a8c\") " pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.571589 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.571525 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-qs8wx"] May 06 17:15:56.574591 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.574576 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:56.576770 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.576754 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" May 06 17:15:56.577291 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.577274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-xsvnt\"" May 06 17:15:56.583543 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.583524 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-qs8wx"] May 06 17:15:56.626341 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.626320 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" May 06 17:15:56.688710 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.688663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr962\" (UniqueName: \"kubernetes.io/projected/39f19d72-d186-4eed-8222-003727ac8098-kube-api-access-dr962\") pod \"odh-model-controller-858dbf95b8-qs8wx\" (UID: \"39f19d72-d186-4eed-8222-003727ac8098\") " pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:56.688844 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.688785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39f19d72-d186-4eed-8222-003727ac8098-cert\") pod \"odh-model-controller-858dbf95b8-qs8wx\" (UID: \"39f19d72-d186-4eed-8222-003727ac8098\") " pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:56.749031 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.749000 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp"] May 06 17:15:56.751894 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:15:56.751864 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcc0e3cf_aae0_4671_be00_8d0656f50a8c.slice/crio-38d6ec6fe36aff9a04fafd1ac226ad89543e9b6712e171824cb36d4e13c65d4f WatchSource:0}: Error finding container 38d6ec6fe36aff9a04fafd1ac226ad89543e9b6712e171824cb36d4e13c65d4f: Status 404 returned error can't find the container with id 38d6ec6fe36aff9a04fafd1ac226ad89543e9b6712e171824cb36d4e13c65d4f May 06 17:15:56.789345 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.789314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39f19d72-d186-4eed-8222-003727ac8098-cert\") pod 
\"odh-model-controller-858dbf95b8-qs8wx\" (UID: \"39f19d72-d186-4eed-8222-003727ac8098\") " pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:56.789449 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.789361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dr962\" (UniqueName: \"kubernetes.io/projected/39f19d72-d186-4eed-8222-003727ac8098-kube-api-access-dr962\") pod \"odh-model-controller-858dbf95b8-qs8wx\" (UID: \"39f19d72-d186-4eed-8222-003727ac8098\") " pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:56.789489 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:15:56.789456 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found May 06 17:15:56.789522 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:15:56.789513 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f19d72-d186-4eed-8222-003727ac8098-cert podName:39f19d72-d186-4eed-8222-003727ac8098 nodeName:}" failed. No retries permitted until 2026-05-06 17:15:57.289496673 +0000 UTC m=+348.852651758 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/39f19d72-d186-4eed-8222-003727ac8098-cert") pod "odh-model-controller-858dbf95b8-qs8wx" (UID: "39f19d72-d186-4eed-8222-003727ac8098") : secret "odh-model-controller-webhook-cert" not found May 06 17:15:56.797796 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:56.797772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr962\" (UniqueName: \"kubernetes.io/projected/39f19d72-d186-4eed-8222-003727ac8098-kube-api-access-dr962\") pod \"odh-model-controller-858dbf95b8-qs8wx\" (UID: \"39f19d72-d186-4eed-8222-003727ac8098\") " pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:57.105143 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:57.105108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" event={"ID":"fcc0e3cf-aae0-4671-be00-8d0656f50a8c","Type":"ContainerStarted","Data":"38d6ec6fe36aff9a04fafd1ac226ad89543e9b6712e171824cb36d4e13c65d4f"} May 06 17:15:57.293756 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:57.293722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39f19d72-d186-4eed-8222-003727ac8098-cert\") pod \"odh-model-controller-858dbf95b8-qs8wx\" (UID: \"39f19d72-d186-4eed-8222-003727ac8098\") " pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:57.293914 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:15:57.293869 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found May 06 17:15:57.293968 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:15:57.293958 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/39f19d72-d186-4eed-8222-003727ac8098-cert podName:39f19d72-d186-4eed-8222-003727ac8098 nodeName:}" failed. 
No retries permitted until 2026-05-06 17:15:58.293937971 +0000 UTC m=+349.857093058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/39f19d72-d186-4eed-8222-003727ac8098-cert") pod "odh-model-controller-858dbf95b8-qs8wx" (UID: "39f19d72-d186-4eed-8222-003727ac8098") : secret "odh-model-controller-webhook-cert" not found May 06 17:15:58.303590 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:58.303553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39f19d72-d186-4eed-8222-003727ac8098-cert\") pod \"odh-model-controller-858dbf95b8-qs8wx\" (UID: \"39f19d72-d186-4eed-8222-003727ac8098\") " pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:58.306212 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:58.306187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/39f19d72-d186-4eed-8222-003727ac8098-cert\") pod \"odh-model-controller-858dbf95b8-qs8wx\" (UID: \"39f19d72-d186-4eed-8222-003727ac8098\") " pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:58.384500 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:58.384462 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" May 06 17:15:58.784411 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:58.784386 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-qs8wx"] May 06 17:15:58.786549 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:15:58.786519 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39f19d72_d186_4eed_8222_003727ac8098.slice/crio-27ef6d39d71a9c59893de2a2c76bdbfb71ef1289605a8b4a08653c72902e2b72 WatchSource:0}: Error finding container 27ef6d39d71a9c59893de2a2c76bdbfb71ef1289605a8b4a08653c72902e2b72: Status 404 returned error can't find the container with id 27ef6d39d71a9c59893de2a2c76bdbfb71ef1289605a8b4a08653c72902e2b72 May 06 17:15:59.112982 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:15:59.112948 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" event={"ID":"39f19d72-d186-4eed-8222-003727ac8098","Type":"ContainerStarted","Data":"27ef6d39d71a9c59893de2a2c76bdbfb71ef1289605a8b4a08653c72902e2b72"} May 06 17:16:02.126604 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.126573 2576 generic.go:358] "Generic (PLEG): container finished" podID="39f19d72-d186-4eed-8222-003727ac8098" containerID="381385c9c9de1cc2179a4d934f5cb6f2335678e2696f3618db51ced8ed2c795d" exitCode=1 May 06 17:16:02.127002 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.126659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" event={"ID":"39f19d72-d186-4eed-8222-003727ac8098","Type":"ContainerDied","Data":"381385c9c9de1cc2179a4d934f5cb6f2335678e2696f3618db51ced8ed2c795d"} May 06 17:16:02.127002 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.126904 2576 scope.go:117] "RemoveContainer" containerID="381385c9c9de1cc2179a4d934f5cb6f2335678e2696f3618db51ced8ed2c795d" May 06 
17:16:02.128046 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.128028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" event={"ID":"fcc0e3cf-aae0-4671-be00-8d0656f50a8c","Type":"ContainerStarted","Data":"d9c1687d2e1a2e6d33bf1dce654ffe581a1e2fc097765e7718e06901be2d7a20"} May 06 17:16:02.134251 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.134218 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-qctrv"] May 06 17:16:02.137135 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.137120 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" May 06 17:16:02.143738 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.143725 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" May 06 17:16:02.143802 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.143777 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-djlbd\"" May 06 17:16:02.149488 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.149464 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-qctrv"] May 06 17:16:02.219166 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.219097 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-69f8cf9d8c-87zfp" podStartSLOduration=1.929565634 podStartE2EDuration="6.219084559s" podCreationTimestamp="2026-05-06 17:15:56 +0000 UTC" firstStartedPulling="2026-05-06 17:15:56.75376982 +0000 UTC m=+348.316924892" lastFinishedPulling="2026-05-06 17:16:01.043288726 +0000 UTC m=+352.606443817" observedRunningTime="2026-05-06 17:16:02.218108405 +0000 UTC m=+353.781263497" watchObservedRunningTime="2026-05-06 
17:16:02.219084559 +0000 UTC m=+353.782239698" May 06 17:16:02.233716 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.233695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c79ce9-c833-4a16-a028-c31791a3a267-cert\") pod \"kserve-controller-manager-856948b99f-qctrv\" (UID: \"b7c79ce9-c833-4a16-a028-c31791a3a267\") " pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" May 06 17:16:02.233844 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.233732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ncll\" (UniqueName: \"kubernetes.io/projected/b7c79ce9-c833-4a16-a028-c31791a3a267-kube-api-access-9ncll\") pod \"kserve-controller-manager-856948b99f-qctrv\" (UID: \"b7c79ce9-c833-4a16-a028-c31791a3a267\") " pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" May 06 17:16:02.335085 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.335055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c79ce9-c833-4a16-a028-c31791a3a267-cert\") pod \"kserve-controller-manager-856948b99f-qctrv\" (UID: \"b7c79ce9-c833-4a16-a028-c31791a3a267\") " pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" May 06 17:16:02.335275 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.335099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ncll\" (UniqueName: \"kubernetes.io/projected/b7c79ce9-c833-4a16-a028-c31791a3a267-kube-api-access-9ncll\") pod \"kserve-controller-manager-856948b99f-qctrv\" (UID: \"b7c79ce9-c833-4a16-a028-c31791a3a267\") " pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" May 06 17:16:02.335275 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:16:02.335199 2576 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret 
"kserve-webhook-server-cert" not found May 06 17:16:02.335398 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:16:02.335282 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7c79ce9-c833-4a16-a028-c31791a3a267-cert podName:b7c79ce9-c833-4a16-a028-c31791a3a267 nodeName:}" failed. No retries permitted until 2026-05-06 17:16:02.835267105 +0000 UTC m=+354.398422174 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b7c79ce9-c833-4a16-a028-c31791a3a267-cert") pod "kserve-controller-manager-856948b99f-qctrv" (UID: "b7c79ce9-c833-4a16-a028-c31791a3a267") : secret "kserve-webhook-server-cert" not found May 06 17:16:02.349014 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.348955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ncll\" (UniqueName: \"kubernetes.io/projected/b7c79ce9-c833-4a16-a028-c31791a3a267-kube-api-access-9ncll\") pod \"kserve-controller-manager-856948b99f-qctrv\" (UID: \"b7c79ce9-c833-4a16-a028-c31791a3a267\") " pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" May 06 17:16:02.839525 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.839499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c79ce9-c833-4a16-a028-c31791a3a267-cert\") pod \"kserve-controller-manager-856948b99f-qctrv\" (UID: \"b7c79ce9-c833-4a16-a028-c31791a3a267\") " pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" May 06 17:16:02.841693 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:02.841667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7c79ce9-c833-4a16-a028-c31791a3a267-cert\") pod \"kserve-controller-manager-856948b99f-qctrv\" (UID: \"b7c79ce9-c833-4a16-a028-c31791a3a267\") " pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" May 06 17:16:03.046928 
ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:03.046901 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" May 06 17:16:03.133179 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:03.133148 2576 generic.go:358] "Generic (PLEG): container finished" podID="39f19d72-d186-4eed-8222-003727ac8098" containerID="c4e7c4c11fe9b4e734c88fd86bb3c7131acd9d002a170c50edbe6b2aebd603a7" exitCode=1 May 06 17:16:03.133557 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:03.133254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" event={"ID":"39f19d72-d186-4eed-8222-003727ac8098","Type":"ContainerDied","Data":"c4e7c4c11fe9b4e734c88fd86bb3c7131acd9d002a170c50edbe6b2aebd603a7"} May 06 17:16:03.133557 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:03.133300 2576 scope.go:117] "RemoveContainer" containerID="381385c9c9de1cc2179a4d934f5cb6f2335678e2696f3618db51ced8ed2c795d" May 06 17:16:03.133557 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:03.133517 2576 scope.go:117] "RemoveContainer" containerID="c4e7c4c11fe9b4e734c88fd86bb3c7131acd9d002a170c50edbe6b2aebd603a7" May 06 17:16:03.133732 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:16:03.133713 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-qs8wx_opendatahub(39f19d72-d186-4eed-8222-003727ac8098)\"" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" podUID="39f19d72-d186-4eed-8222-003727ac8098" May 06 17:16:03.194101 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:03.194073 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-qctrv"] May 06 17:16:03.197557 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:16:03.197531 2576 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7c79ce9_c833_4a16_a028_c31791a3a267.slice/crio-aae454bd1a53e2cce2fac0a3904238e51bb63f291552b03259170a8d694a4ad4 WatchSource:0}: Error finding container aae454bd1a53e2cce2fac0a3904238e51bb63f291552b03259170a8d694a4ad4: Status 404 returned error can't find the container with id aae454bd1a53e2cce2fac0a3904238e51bb63f291552b03259170a8d694a4ad4 May 06 17:16:04.138973 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:04.138940 2576 scope.go:117] "RemoveContainer" containerID="c4e7c4c11fe9b4e734c88fd86bb3c7131acd9d002a170c50edbe6b2aebd603a7" May 06 17:16:04.139418 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:16:04.139177 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-qs8wx_opendatahub(39f19d72-d186-4eed-8222-003727ac8098)\"" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" podUID="39f19d72-d186-4eed-8222-003727ac8098" May 06 17:16:04.140001 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:04.139969 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" event={"ID":"b7c79ce9-c833-4a16-a028-c31791a3a267","Type":"ContainerStarted","Data":"aae454bd1a53e2cce2fac0a3904238e51bb63f291552b03259170a8d694a4ad4"} May 06 17:16:06.600143 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.600112 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"] May 06 17:16:06.602949 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.602931 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:06.606246 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.606200 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
May 06 17:16:06.606346 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.606328 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-kdxnk\""
May 06 17:16:06.606389 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.606364 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
May 06 17:16:06.618746 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.618726 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"]
May 06 17:16:06.669873 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.669848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f0c7102b-9fae-430e-b394-a005ceda513a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-kvmcm\" (UID: \"f0c7102b-9fae-430e-b394-a005ceda513a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:06.670004 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.669889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpsmr\" (UniqueName: \"kubernetes.io/projected/f0c7102b-9fae-430e-b394-a005ceda513a-kube-api-access-mpsmr\") pod \"servicemesh-operator3-55f49c5f94-kvmcm\" (UID: \"f0c7102b-9fae-430e-b394-a005ceda513a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:06.770248 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.770205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f0c7102b-9fae-430e-b394-a005ceda513a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-kvmcm\" (UID: \"f0c7102b-9fae-430e-b394-a005ceda513a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:06.770368 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.770264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpsmr\" (UniqueName: \"kubernetes.io/projected/f0c7102b-9fae-430e-b394-a005ceda513a-kube-api-access-mpsmr\") pod \"servicemesh-operator3-55f49c5f94-kvmcm\" (UID: \"f0c7102b-9fae-430e-b394-a005ceda513a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:06.772589 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.772563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f0c7102b-9fae-430e-b394-a005ceda513a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-kvmcm\" (UID: \"f0c7102b-9fae-430e-b394-a005ceda513a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:06.779638 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.779614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpsmr\" (UniqueName: \"kubernetes.io/projected/f0c7102b-9fae-430e-b394-a005ceda513a-kube-api-access-mpsmr\") pod \"servicemesh-operator3-55f49c5f94-kvmcm\" (UID: \"f0c7102b-9fae-430e-b394-a005ceda513a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:06.912436 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:06.912373 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:07.040982 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:07.040950 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"]
May 06 17:16:07.043928 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:16:07.043898 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0c7102b_9fae_430e_b394_a005ceda513a.slice/crio-3c99cb0d002fd38b1aaa1cbc1c20a02d2ffc14cf1398417140fa60e0d5e7e1f0 WatchSource:0}: Error finding container 3c99cb0d002fd38b1aaa1cbc1c20a02d2ffc14cf1398417140fa60e0d5e7e1f0: Status 404 returned error can't find the container with id 3c99cb0d002fd38b1aaa1cbc1c20a02d2ffc14cf1398417140fa60e0d5e7e1f0
May 06 17:16:07.156121 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:07.156091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm" event={"ID":"f0c7102b-9fae-430e-b394-a005ceda513a","Type":"ContainerStarted","Data":"3c99cb0d002fd38b1aaa1cbc1c20a02d2ffc14cf1398417140fa60e0d5e7e1f0"}
May 06 17:16:07.157272 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:07.157248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" event={"ID":"b7c79ce9-c833-4a16-a028-c31791a3a267","Type":"ContainerStarted","Data":"14203f0f1fbe013a598cf572bd8cb1d0a1bb505d4fd4da34d9d82c7a32ee41f8"}
May 06 17:16:07.157430 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:07.157414 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-qctrv"
May 06 17:16:07.181363 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:07.181293 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-qctrv" podStartSLOduration=2.126056938 podStartE2EDuration="5.181280352s" podCreationTimestamp="2026-05-06 17:16:02 +0000 UTC" firstStartedPulling="2026-05-06 17:16:03.198767251 +0000 UTC m=+354.761922324" lastFinishedPulling="2026-05-06 17:16:06.253990656 +0000 UTC m=+357.817145738" observedRunningTime="2026-05-06 17:16:07.180333977 +0000 UTC m=+358.743489068" watchObservedRunningTime="2026-05-06 17:16:07.181280352 +0000 UTC m=+358.744435443"
May 06 17:16:08.384849 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:08.384821 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx"
May 06 17:16:08.385254 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:08.385152 2576 scope.go:117] "RemoveContainer" containerID="c4e7c4c11fe9b4e734c88fd86bb3c7131acd9d002a170c50edbe6b2aebd603a7"
May 06 17:16:08.385353 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:16:08.385336 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-qs8wx_opendatahub(39f19d72-d186-4eed-8222-003727ac8098)\"" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" podUID="39f19d72-d186-4eed-8222-003727ac8098"
May 06 17:16:12.176999 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:12.176960 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm" event={"ID":"f0c7102b-9fae-430e-b394-a005ceda513a","Type":"ContainerStarted","Data":"5ebed037b60e0adf1f1f0f44e82ec5c5b966b6417c437eb0ed75509beb48cf2a"}
May 06 17:16:12.177524 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:12.177046 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:12.199291 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:12.199222 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm" podStartSLOduration=1.563052375 podStartE2EDuration="6.199207707s" podCreationTimestamp="2026-05-06 17:16:06 +0000 UTC" firstStartedPulling="2026-05-06 17:16:07.046387656 +0000 UTC m=+358.609542726" lastFinishedPulling="2026-05-06 17:16:11.682542985 +0000 UTC m=+363.245698058" observedRunningTime="2026-05-06 17:16:12.197932452 +0000 UTC m=+363.761087544" watchObservedRunningTime="2026-05-06 17:16:12.199207707 +0000 UTC m=+363.762362800"
May 06 17:16:18.384769 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:18.384734 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx"
May 06 17:16:18.385281 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:18.385222 2576 scope.go:117] "RemoveContainer" containerID="c4e7c4c11fe9b4e734c88fd86bb3c7131acd9d002a170c50edbe6b2aebd603a7"
May 06 17:16:19.199395 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:19.199363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" event={"ID":"39f19d72-d186-4eed-8222-003727ac8098","Type":"ContainerStarted","Data":"bb012243b3dac42ac0ec1f18ef1a732c377361a6fd0c0ba2741e843964ce74bf"}
May 06 17:16:19.199573 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:19.199548 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx"
May 06 17:16:19.221376 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:19.221332 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx" podStartSLOduration=3.349950192 podStartE2EDuration="23.221320695s" podCreationTimestamp="2026-05-06 17:15:56 +0000 UTC" firstStartedPulling="2026-05-06 17:15:58.787876966 +0000 UTC m=+350.351032039" lastFinishedPulling="2026-05-06 17:16:18.659247468 +0000 UTC m=+370.222402542" observedRunningTime="2026-05-06 17:16:19.220391854 +0000 UTC m=+370.783546948" watchObservedRunningTime="2026-05-06 17:16:19.221320695 +0000 UTC m=+370.784475832"
May 06 17:16:23.182051 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:23.182011 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-kvmcm"
May 06 17:16:30.204945 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:30.204916 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-qs8wx"
May 06 17:16:33.200799 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.200766 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"]
May 06 17:16:33.206473 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.206451 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.208996 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.208972 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
May 06 17:16:33.209116 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.209053 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
May 06 17:16:33.209116 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.209081 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-qddmg\""
May 06 17:16:33.209260 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.209019 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-gw-ca-root-cert\""
May 06 17:16:33.209260 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.209056 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
May 06 17:16:33.218956 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.218931 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"]
May 06 17:16:33.279156 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.279126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/3f660e87-5baf-4972-b849-6bf52968f521-local-certs\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.279335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.279185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/3f660e87-5baf-4972-b849-6bf52968f521-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.279335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.279260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/3f660e87-5baf-4972-b849-6bf52968f521-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.279335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.279295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/3f660e87-5baf-4972-b849-6bf52968f521-cacerts\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.279335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.279314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw2t2\" (UniqueName: \"kubernetes.io/projected/3f660e87-5baf-4972-b849-6bf52968f521-kube-api-access-rw2t2\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.279335 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.279335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3f660e87-5baf-4972-b849-6bf52968f521-istio-kubeconfig\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.279494 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.279359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3f660e87-5baf-4972-b849-6bf52968f521-istio-token\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.380457 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.380420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3f660e87-5baf-4972-b849-6bf52968f521-istio-kubeconfig\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.380621 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.380464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3f660e87-5baf-4972-b849-6bf52968f521-istio-token\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.380621 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.380587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/3f660e87-5baf-4972-b849-6bf52968f521-local-certs\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.380743 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.380701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/3f660e87-5baf-4972-b849-6bf52968f521-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.380786 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.380772 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/3f660e87-5baf-4972-b849-6bf52968f521-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.380836 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.380813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/3f660e87-5baf-4972-b849-6bf52968f521-cacerts\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.381095 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.381063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rw2t2\" (UniqueName: \"kubernetes.io/projected/3f660e87-5baf-4972-b849-6bf52968f521-kube-api-access-rw2t2\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.381323 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.381294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/3f660e87-5baf-4972-b849-6bf52968f521-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.383085 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.383056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/3f660e87-5baf-4972-b849-6bf52968f521-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.383199 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.383094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/3f660e87-5baf-4972-b849-6bf52968f521-local-certs\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.383199 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.383156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/3f660e87-5baf-4972-b849-6bf52968f521-cacerts\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.383323 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.383216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/3f660e87-5baf-4972-b849-6bf52968f521-istio-kubeconfig\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.391431 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.391408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw2t2\" (UniqueName: \"kubernetes.io/projected/3f660e87-5baf-4972-b849-6bf52968f521-kube-api-access-rw2t2\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.392080 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.392054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3f660e87-5baf-4972-b849-6bf52968f521-istio-token\") pod \"istiod-openshift-gateway-798958bb55-d68fg\" (UID: \"3f660e87-5baf-4972-b849-6bf52968f521\") " pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.516615 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.516532 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:33.653512 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:33.653485 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"]
May 06 17:16:33.656280 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:16:33.656248 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f660e87_5baf_4972_b849_6bf52968f521.slice/crio-210abfd6995c12930b229870996880d3d102d25f052ff0cc594e60b76af61a92 WatchSource:0}: Error finding container 210abfd6995c12930b229870996880d3d102d25f052ff0cc594e60b76af61a92: Status 404 returned error can't find the container with id 210abfd6995c12930b229870996880d3d102d25f052ff0cc594e60b76af61a92
May 06 17:16:34.250722 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:34.250678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg" event={"ID":"3f660e87-5baf-4972-b849-6bf52968f521","Type":"ContainerStarted","Data":"210abfd6995c12930b229870996880d3d102d25f052ff0cc594e60b76af61a92"}
May 06 17:16:36.821425 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:36.821378 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
May 06 17:16:36.821723 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:36.821469 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
May 06 17:16:37.262956 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:37.262867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg" event={"ID":"3f660e87-5baf-4972-b849-6bf52968f521","Type":"ContainerStarted","Data":"aeccf9d8a36c91676586316748afeb6e8b4e40ca478108432c2471b77b9ba93f"}
May 06 17:16:37.263110 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:37.263060 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:16:37.264812 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:37.264788 2576 patch_prober.go:28] interesting pod/istiod-openshift-gateway-798958bb55-d68fg container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
May 06 17:16:37.264913 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:37.264831 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg" podUID="3f660e87-5baf-4972-b849-6bf52968f521" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
May 06 17:16:37.290593 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:37.290549 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg" podStartSLOduration=1.128127719 podStartE2EDuration="4.290534103s" podCreationTimestamp="2026-05-06 17:16:33 +0000 UTC" firstStartedPulling="2026-05-06 17:16:33.65871325 +0000 UTC m=+385.221868324" lastFinishedPulling="2026-05-06 17:16:36.821119637 +0000 UTC m=+388.384274708" observedRunningTime="2026-05-06 17:16:37.289592718 +0000 UTC m=+388.852747811" watchObservedRunningTime="2026-05-06 17:16:37.290534103 +0000 UTC m=+388.853689194"
May 06 17:16:38.168097 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:38.168066 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-qctrv"
May 06 17:16:38.266972 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:16:38.266942 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-798958bb55-d68fg"
May 06 17:17:33.093131 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.093040 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"]
May 06 17:17:33.095203 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.095186 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"
May 06 17:17:33.099497 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.099475 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
May 06 17:17:33.099593 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.099475 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-2mccw\""
May 06 17:17:33.100501 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.100476 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
May 06 17:17:33.111572 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.111541 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"]
May 06 17:17:33.160975 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.160945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8k8g\" (UniqueName: \"kubernetes.io/projected/14650863-45bd-4ade-a7b1-3117d0af7331-kube-api-access-c8k8g\") pod \"limitador-operator-controller-manager-85c4996f8c-plg76\" (UID: \"14650863-45bd-4ade-a7b1-3117d0af7331\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"
May 06 17:17:33.262146 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.262122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8k8g\" (UniqueName: \"kubernetes.io/projected/14650863-45bd-4ade-a7b1-3117d0af7331-kube-api-access-c8k8g\") pod \"limitador-operator-controller-manager-85c4996f8c-plg76\" (UID: \"14650863-45bd-4ade-a7b1-3117d0af7331\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"
May 06 17:17:33.272872 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.272850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8k8g\" (UniqueName: \"kubernetes.io/projected/14650863-45bd-4ade-a7b1-3117d0af7331-kube-api-access-c8k8g\") pod \"limitador-operator-controller-manager-85c4996f8c-plg76\" (UID: \"14650863-45bd-4ade-a7b1-3117d0af7331\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"
May 06 17:17:33.405106 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.405038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"
May 06 17:17:33.545628 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:33.545605 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"]
May 06 17:17:33.547895 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:17:33.547856 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14650863_45bd_4ade_a7b1_3117d0af7331.slice/crio-8ba0805a526dc9680086904b7deab575b5ab9fa51ad424b8d824a6205ef75574 WatchSource:0}: Error finding container 8ba0805a526dc9680086904b7deab575b5ab9fa51ad424b8d824a6205ef75574: Status 404 returned error can't find the container with id 8ba0805a526dc9680086904b7deab575b5ab9fa51ad424b8d824a6205ef75574
May 06 17:17:34.445162 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:34.445124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76" event={"ID":"14650863-45bd-4ade-a7b1-3117d0af7331","Type":"ContainerStarted","Data":"8ba0805a526dc9680086904b7deab575b5ab9fa51ad424b8d824a6205ef75574"}
May 06 17:17:35.772078 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:35.772047 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"]
May 06 17:17:35.774638 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:35.774615 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"
May 06 17:17:35.777277 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:35.777251 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
May 06 17:17:35.777413 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:35.777224 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-sg85s\""
May 06 17:17:35.784162 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:35.779407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wndmt\" (UniqueName: \"kubernetes.io/projected/14493cea-fdae-4d3b-a1f4-034511a9b419-kube-api-access-wndmt\") pod \"dns-operator-controller-manager-648d5c98bc-jshlz\" (UID: \"14493cea-fdae-4d3b-a1f4-034511a9b419\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"
May 06 17:17:35.794485 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:35.794460 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"]
May 06 17:17:35.879990 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:35.879954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wndmt\" (UniqueName: \"kubernetes.io/projected/14493cea-fdae-4d3b-a1f4-034511a9b419-kube-api-access-wndmt\") pod \"dns-operator-controller-manager-648d5c98bc-jshlz\" (UID: \"14493cea-fdae-4d3b-a1f4-034511a9b419\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"
May 06 17:17:35.893922 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:35.893896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wndmt\" (UniqueName: \"kubernetes.io/projected/14493cea-fdae-4d3b-a1f4-034511a9b419-kube-api-access-wndmt\") pod \"dns-operator-controller-manager-648d5c98bc-jshlz\" (UID: \"14493cea-fdae-4d3b-a1f4-034511a9b419\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"
May 06 17:17:36.093486 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:36.093459 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"
May 06 17:17:36.225438 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:36.225413 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"]
May 06 17:17:36.227215 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:17:36.227184 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14493cea_fdae_4d3b_a1f4_034511a9b419.slice/crio-c38eaba8831ad127e8425587c68a3adacd83a442734e64a331f2f33b76c2e398 WatchSource:0}: Error finding container c38eaba8831ad127e8425587c68a3adacd83a442734e64a331f2f33b76c2e398: Status 404 returned error can't find the container with id c38eaba8831ad127e8425587c68a3adacd83a442734e64a331f2f33b76c2e398
May 06 17:17:36.453134 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:36.453052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz" event={"ID":"14493cea-fdae-4d3b-a1f4-034511a9b419","Type":"ContainerStarted","Data":"c38eaba8831ad127e8425587c68a3adacd83a442734e64a331f2f33b76c2e398"}
May 06 17:17:36.454418 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:36.454390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76" event={"ID":"14650863-45bd-4ade-a7b1-3117d0af7331","Type":"ContainerStarted","Data":"c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01"}
May 06 17:17:36.454679 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:36.454664 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"
May 06 17:17:36.474089 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:36.474039 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76" podStartSLOduration=1.155719285 podStartE2EDuration="3.474027253s" podCreationTimestamp="2026-05-06 17:17:33 +0000 UTC" firstStartedPulling="2026-05-06 17:17:33.549773028 +0000 UTC m=+445.112928097" lastFinishedPulling="2026-05-06 17:17:35.868080995 +0000 UTC m=+447.431236065" observedRunningTime="2026-05-06 17:17:36.472567014 +0000 UTC m=+448.035722118" watchObservedRunningTime="2026-05-06 17:17:36.474027253 +0000 UTC m=+448.037182345"
May 06 17:17:39.471159 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:39.471119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz" event={"ID":"14493cea-fdae-4d3b-a1f4-034511a9b419","Type":"ContainerStarted","Data":"abf86c7246e5c802f8696458dabadb86dd9dedd728e6e6f4528dc074801aae10"}
May 06 17:17:39.471634 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:39.471275 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"
May 06 17:17:39.497103 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:39.497058 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz" podStartSLOduration=2.023354023 podStartE2EDuration="4.497046523s" podCreationTimestamp="2026-05-06 17:17:35 +0000 UTC" firstStartedPulling="2026-05-06 17:17:36.229290336 +0000 UTC m=+447.792445407" lastFinishedPulling="2026-05-06 17:17:38.702982833 +0000 UTC m=+450.266137907" observedRunningTime="2026-05-06 17:17:39.495321085 +0000 UTC m=+451.058476178" watchObservedRunningTime="2026-05-06 17:17:39.497046523 +0000 UTC m=+451.060201615"
May 06 17:17:47.460047 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:47.460009 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"
May 06 17:17:50.477425 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.477388 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-jshlz"
May 06 17:17:50.631902 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.631872 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"]
May 06 17:17:50.632152 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.632123 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76" podUID="14650863-45bd-4ade-a7b1-3117d0af7331" containerName="manager" containerID="cri-o://c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01" gracePeriod=2
May 06 17:17:50.637683 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.637630 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76"]
May 06 17:17:50.655141 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.655120 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb"]
May 06 17:17:50.655472 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.655458 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14650863-45bd-4ade-a7b1-3117d0af7331" containerName="manager"
May 06 17:17:50.655510 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.655474 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="14650863-45bd-4ade-a7b1-3117d0af7331"
containerName="manager" May 06 17:17:50.655557 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.655549 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="14650863-45bd-4ade-a7b1-3117d0af7331" containerName="manager" May 06 17:17:50.657615 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.657595 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" May 06 17:17:50.659632 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.659605 2576 status_manager.go:895] "Failed to get status for pod" podUID="14650863-45bd-4ade-a7b1-3117d0af7331" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76" err="pods \"limitador-operator-controller-manager-85c4996f8c-plg76\" is forbidden: User \"system:node:ip-10-0-135-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-110.ec2.internal' and this object" May 06 17:17:50.678170 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.678145 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb"] May 06 17:17:50.680563 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.680536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlnrk\" (UniqueName: \"kubernetes.io/projected/75941811-3994-4243-87d5-9bff9cb3bc8e-kube-api-access-jlnrk\") pod \"limitador-operator-controller-manager-85c4996f8c-x8gjb\" (UID: \"75941811-3994-4243-87d5-9bff9cb3bc8e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" May 06 17:17:50.781070 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.781031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlnrk\" (UniqueName: 
\"kubernetes.io/projected/75941811-3994-4243-87d5-9bff9cb3bc8e-kube-api-access-jlnrk\") pod \"limitador-operator-controller-manager-85c4996f8c-x8gjb\" (UID: \"75941811-3994-4243-87d5-9bff9cb3bc8e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" May 06 17:17:50.795997 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.795972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlnrk\" (UniqueName: \"kubernetes.io/projected/75941811-3994-4243-87d5-9bff9cb3bc8e-kube-api-access-jlnrk\") pod \"limitador-operator-controller-manager-85c4996f8c-x8gjb\" (UID: \"75941811-3994-4243-87d5-9bff9cb3bc8e\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" May 06 17:17:50.875878 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.875857 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76" May 06 17:17:50.878529 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.878505 2576 status_manager.go:895] "Failed to get status for pod" podUID="14650863-45bd-4ade-a7b1-3117d0af7331" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76" err="pods \"limitador-operator-controller-manager-85c4996f8c-plg76\" is forbidden: User \"system:node:ip-10-0-135-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-110.ec2.internal' and this object" May 06 17:17:50.881810 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.881794 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8k8g\" (UniqueName: \"kubernetes.io/projected/14650863-45bd-4ade-a7b1-3117d0af7331-kube-api-access-c8k8g\") pod \"14650863-45bd-4ade-a7b1-3117d0af7331\" (UID: \"14650863-45bd-4ade-a7b1-3117d0af7331\") " May 06 17:17:50.883733 ip-10-0-135-110 
kubenswrapper[2576]: I0506 17:17:50.883712 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14650863-45bd-4ade-a7b1-3117d0af7331-kube-api-access-c8k8g" (OuterVolumeSpecName: "kube-api-access-c8k8g") pod "14650863-45bd-4ade-a7b1-3117d0af7331" (UID: "14650863-45bd-4ade-a7b1-3117d0af7331"). InnerVolumeSpecName "kube-api-access-c8k8g". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 06 17:17:50.951070 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.951042 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14650863-45bd-4ade-a7b1-3117d0af7331" path="/var/lib/kubelet/pods/14650863-45bd-4ade-a7b1-3117d0af7331/volumes" May 06 17:17:50.982267 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:50.982215 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c8k8g\" (UniqueName: \"kubernetes.io/projected/14650863-45bd-4ade-a7b1-3117d0af7331-kube-api-access-c8k8g\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath \"\"" May 06 17:17:51.008342 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.008291 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" May 06 17:17:51.166883 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.166856 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb"] May 06 17:17:51.167905 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:17:51.167880 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75941811_3994_4243_87d5_9bff9cb3bc8e.slice/crio-317b3d171f84d1d4476baa7aac62556911ad58d912b9b4a3d375815971e57006 WatchSource:0}: Error finding container 317b3d171f84d1d4476baa7aac62556911ad58d912b9b4a3d375815971e57006: Status 404 returned error can't find the container with id 317b3d171f84d1d4476baa7aac62556911ad58d912b9b4a3d375815971e57006 May 06 17:17:51.509493 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.509453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" event={"ID":"75941811-3994-4243-87d5-9bff9cb3bc8e","Type":"ContainerStarted","Data":"52772e2baaec188f1adf114899e75abcc85bd262f1fa97f66413437215b8116e"} May 06 17:17:51.509907 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.509504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" event={"ID":"75941811-3994-4243-87d5-9bff9cb3bc8e","Type":"ContainerStarted","Data":"317b3d171f84d1d4476baa7aac62556911ad58d912b9b4a3d375815971e57006"} May 06 17:17:51.509907 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.509548 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" May 06 17:17:51.510628 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.510606 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="14650863-45bd-4ade-a7b1-3117d0af7331" containerID="c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01" exitCode=0 May 06 17:17:51.510688 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.510656 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-plg76" May 06 17:17:51.510688 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.510674 2576 scope.go:117] "RemoveContainer" containerID="c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01" May 06 17:17:51.521472 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.521456 2576 scope.go:117] "RemoveContainer" containerID="c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01" May 06 17:17:51.521815 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:17:51.521785 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01\": container with ID starting with c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01 not found: ID does not exist" containerID="c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01" May 06 17:17:51.521886 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.521817 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01"} err="failed to get container status \"c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01\": rpc error: code = NotFound desc = could not find container \"c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01\": container with ID starting with c5379a6ba0bc2567cf0bd98be4fb592f7c038f56afdfb7d72a96206061330b01 not found: ID does not exist" May 06 17:17:51.531655 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:17:51.531619 2576 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" podStartSLOduration=1.531609534 podStartE2EDuration="1.531609534s" podCreationTimestamp="2026-05-06 17:17:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:17:51.529433933 +0000 UTC m=+463.092589051" watchObservedRunningTime="2026-05-06 17:17:51.531609534 +0000 UTC m=+463.094764625" May 06 17:18:02.517431 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:02.517350 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-x8gjb" May 06 17:18:29.009296 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.009261 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-tsq7b"] May 06 17:18:29.013366 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.012628 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:18:29.016940 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.016918 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-mswl4\"" May 06 17:18:29.017261 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.017216 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" May 06 17:18:29.029996 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.029976 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-tsq7b"] May 06 17:18:29.071812 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.071786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-tsq7b"] May 06 17:18:29.083674 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.083648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdkhr\" (UniqueName: \"kubernetes.io/projected/25d6401d-4a4b-4c00-8512-8f2034dc8c43-kube-api-access-cdkhr\") pod \"limitador-limitador-78c99df468-tsq7b\" (UID: \"25d6401d-4a4b-4c00-8512-8f2034dc8c43\") " pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:18:29.083803 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.083702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/25d6401d-4a4b-4c00-8512-8f2034dc8c43-config-file\") pod \"limitador-limitador-78c99df468-tsq7b\" (UID: \"25d6401d-4a4b-4c00-8512-8f2034dc8c43\") " pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:18:29.184888 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.184852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" 
(UniqueName: \"kubernetes.io/configmap/25d6401d-4a4b-4c00-8512-8f2034dc8c43-config-file\") pod \"limitador-limitador-78c99df468-tsq7b\" (UID: \"25d6401d-4a4b-4c00-8512-8f2034dc8c43\") " pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:18:29.185039 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.184952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdkhr\" (UniqueName: \"kubernetes.io/projected/25d6401d-4a4b-4c00-8512-8f2034dc8c43-kube-api-access-cdkhr\") pod \"limitador-limitador-78c99df468-tsq7b\" (UID: \"25d6401d-4a4b-4c00-8512-8f2034dc8c43\") " pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:18:29.185582 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.185560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/25d6401d-4a4b-4c00-8512-8f2034dc8c43-config-file\") pod \"limitador-limitador-78c99df468-tsq7b\" (UID: \"25d6401d-4a4b-4c00-8512-8f2034dc8c43\") " pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:18:29.201125 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.201102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdkhr\" (UniqueName: \"kubernetes.io/projected/25d6401d-4a4b-4c00-8512-8f2034dc8c43-kube-api-access-cdkhr\") pod \"limitador-limitador-78c99df468-tsq7b\" (UID: \"25d6401d-4a4b-4c00-8512-8f2034dc8c43\") " pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:18:29.325365 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.325333 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:18:29.497426 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.497291 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-tsq7b"] May 06 17:18:29.641341 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:29.641264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" event={"ID":"25d6401d-4a4b-4c00-8512-8f2034dc8c43","Type":"ContainerStarted","Data":"2991424526d00c77a338ca4ad7ca58d738ab7d9b7fa8994d330bc0263a413569"} May 06 17:18:32.653453 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:32.653407 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" event={"ID":"25d6401d-4a4b-4c00-8512-8f2034dc8c43","Type":"ContainerStarted","Data":"e8f0c6edf0741c3c4c82e2f23698319a1f95e82c86460458c24d113c16d6a521"} May 06 17:18:32.653837 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:32.653517 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:18:32.706052 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:32.705990 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" podStartSLOduration=2.118260965 podStartE2EDuration="4.705972419s" podCreationTimestamp="2026-05-06 17:18:28 +0000 UTC" firstStartedPulling="2026-05-06 17:18:29.498182969 +0000 UTC m=+501.061338038" lastFinishedPulling="2026-05-06 17:18:32.085894422 +0000 UTC m=+503.649049492" observedRunningTime="2026-05-06 17:18:32.705172011 +0000 UTC m=+504.268327102" watchObservedRunningTime="2026-05-06 17:18:32.705972419 +0000 UTC m=+504.269127909" May 06 17:18:43.657969 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:18:43.657937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-tsq7b" May 06 17:20:08.893656 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:20:08.893628 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:20:08.894263 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:20:08.894245 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:20:08.901906 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:20:08.901884 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:20:08.902248 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:20:08.902208 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:25:08.916095 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:25:08.916064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:25:08.919020 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:25:08.918998 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:25:08.925320 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:25:08.925300 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:25:08.927852 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:25:08.927834 2576 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:30:08.940275 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:30:08.940222 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:30:08.943268 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:30:08.943220 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:30:08.948824 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:30:08.948799 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:30:08.952130 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:30:08.952113 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:35:08.963614 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:35:08.963502 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:35:08.968824 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:35:08.968800 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:35:08.972364 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:35:08.972341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:35:08.977434 
ip-10-0-135-110 kubenswrapper[2576]: I0506 17:35:08.977416 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:40:08.987792 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:08.987688 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:40:08.994599 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:08.992588 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log" May 06 17:40:08.995978 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:08.995955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:40:09.000207 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:09.000193 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:40:12.782084 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.782048 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"] May 06 17:40:12.785369 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.785347 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:12.788102 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.788085 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-mb796\"" May 06 17:40:12.788216 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.788197 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" May 06 17:40:12.788998 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.788982 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" May 06 17:40:12.789053 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.789017 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" May 06 17:40:12.797603 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.797573 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"] May 06 17:40:12.953278 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.953243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:12.953449 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.953286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/280bac76-1566-423d-9f5e-c0c8a0aded7b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:12.953449 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.953346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkrrk\" (UniqueName: \"kubernetes.io/projected/280bac76-1566-423d-9f5e-c0c8a0aded7b-kube-api-access-fkrrk\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:12.953449 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.953397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:12.953449 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.953427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:12.953665 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:12.953453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:13.054794 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.054726 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/280bac76-1566-423d-9f5e-c0c8a0aded7b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:13.054794 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.054781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkrrk\" (UniqueName: \"kubernetes.io/projected/280bac76-1566-423d-9f5e-c0c8a0aded7b-kube-api-access-fkrrk\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:13.054986 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.054807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:13.054986 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.054822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" May 06 17:40:13.054986 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.054842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-kserve-provision-location\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:13.054986 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.054867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:13.055263 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.055216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:13.055380 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.055334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:13.055380 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.055272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:13.056948 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.056930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/280bac76-1566-423d-9f5e-c0c8a0aded7b-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:13.057119 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.057103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/280bac76-1566-423d-9f5e-c0c8a0aded7b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:13.068534 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.068514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkrrk\" (UniqueName: \"kubernetes.io/projected/280bac76-1566-423d-9f5e-c0c8a0aded7b-kube-api-access-fkrrk\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44\" (UID: \"280bac76-1566-423d-9f5e-c0c8a0aded7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:13.095460 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.095435 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:13.221690 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.221669 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"]
May 06 17:40:13.223698 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:40:13.223672 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280bac76_1566_423d_9f5e_c0c8a0aded7b.slice/crio-000875ee43ff5b423f19d31bd6b113f8116890dceb71b374ae167db64b224101 WatchSource:0}: Error finding container 000875ee43ff5b423f19d31bd6b113f8116890dceb71b374ae167db64b224101: Status 404 returned error can't find the container with id 000875ee43ff5b423f19d31bd6b113f8116890dceb71b374ae167db64b224101
May 06 17:40:13.225411 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.225391 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
May 06 17:40:13.392991 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:13.392932 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-tsq7b"]
May 06 17:40:14.198650 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:14.198604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" event={"ID":"280bac76-1566-423d-9f5e-c0c8a0aded7b","Type":"ContainerStarted","Data":"000875ee43ff5b423f19d31bd6b113f8116890dceb71b374ae167db64b224101"}
May 06 17:40:19.219816 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:19.219782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" event={"ID":"280bac76-1566-423d-9f5e-c0c8a0aded7b","Type":"ContainerStarted","Data":"a610fe2d9b4628ee3a6dc929cf805e59c0abe1bbcc4cd84b17cbcfd57aaf53a8"}
May 06 17:40:24.240203 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:24.240170 2576 generic.go:358] "Generic (PLEG): container finished" podID="280bac76-1566-423d-9f5e-c0c8a0aded7b" containerID="a610fe2d9b4628ee3a6dc929cf805e59c0abe1bbcc4cd84b17cbcfd57aaf53a8" exitCode=0
May 06 17:40:24.240607 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:24.240254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" event={"ID":"280bac76-1566-423d-9f5e-c0c8a0aded7b","Type":"ContainerDied","Data":"a610fe2d9b4628ee3a6dc929cf805e59c0abe1bbcc4cd84b17cbcfd57aaf53a8"}
May 06 17:40:25.016968 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:25.016936 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-tsq7b"]
May 06 17:40:26.248934 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:26.248899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" event={"ID":"280bac76-1566-423d-9f5e-c0c8a0aded7b","Type":"ContainerStarted","Data":"c2bcd077dd74fa6bfbbeb0c5c50449ab2db9fcc138eb520bce542225069f998d"}
May 06 17:40:26.249361 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:26.249107 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:40:26.271582 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:26.271536 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44" podStartSLOduration=2.142503414 podStartE2EDuration="14.271520208s" podCreationTimestamp="2026-05-06 17:40:12 +0000 UTC" firstStartedPulling="2026-05-06 17:40:13.225540878 +0000 UTC m=+1804.788695948" lastFinishedPulling="2026-05-06 17:40:25.354557669 +0000 UTC m=+1816.917712742" observedRunningTime="2026-05-06 17:40:26.270143652 +0000 UTC m=+1817.833298742" watchObservedRunningTime="2026-05-06 17:40:26.271520208 +0000 UTC m=+1817.834675303"
May 06 17:40:37.264540 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:40:37.264511 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44"
May 06 17:45:00.156931 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:00.156898 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29634825-97rwn"]
May 06 17:45:00.160103 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:00.160087 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn"
May 06 17:45:00.163223 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:00.163204 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-7h8kw\""
May 06 17:45:00.170345 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:00.170324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrj5d\" (UniqueName: \"kubernetes.io/projected/b4e12d31-7e09-464b-818c-ee0acb2ed435-kube-api-access-nrj5d\") pod \"maas-api-key-cleanup-29634825-97rwn\" (UID: \"b4e12d31-7e09-464b-818c-ee0acb2ed435\") " pod="opendatahub/maas-api-key-cleanup-29634825-97rwn"
May 06 17:45:00.181133 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:00.181110 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634825-97rwn"]
May 06 17:45:00.270902 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:00.270870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrj5d\" (UniqueName: \"kubernetes.io/projected/b4e12d31-7e09-464b-818c-ee0acb2ed435-kube-api-access-nrj5d\") pod \"maas-api-key-cleanup-29634825-97rwn\" (UID: \"b4e12d31-7e09-464b-818c-ee0acb2ed435\") " pod="opendatahub/maas-api-key-cleanup-29634825-97rwn"
May 06 17:45:00.280549 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:00.280528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrj5d\" (UniqueName: \"kubernetes.io/projected/b4e12d31-7e09-464b-818c-ee0acb2ed435-kube-api-access-nrj5d\") pod \"maas-api-key-cleanup-29634825-97rwn\" (UID: \"b4e12d31-7e09-464b-818c-ee0acb2ed435\") " pod="opendatahub/maas-api-key-cleanup-29634825-97rwn"
May 06 17:45:00.469696 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:00.469632 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn"
May 06 17:45:00.594676 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:00.594648 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634825-97rwn"]
May 06 17:45:00.596759 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:45:00.596735 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e12d31_7e09_464b_818c_ee0acb2ed435.slice/crio-eba8325d309458c6d6f377cf89dcfe83773779d3385954551b8f00d2a58084d7 WatchSource:0}: Error finding container eba8325d309458c6d6f377cf89dcfe83773779d3385954551b8f00d2a58084d7: Status 404 returned error can't find the container with id eba8325d309458c6d6f377cf89dcfe83773779d3385954551b8f00d2a58084d7
May 06 17:45:01.234342 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:01.234313 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" event={"ID":"b4e12d31-7e09-464b-818c-ee0acb2ed435","Type":"ContainerStarted","Data":"eba8325d309458c6d6f377cf89dcfe83773779d3385954551b8f00d2a58084d7"}
May 06 17:45:04.245725 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:04.245693 2576 generic.go:358] "Generic (PLEG): container finished" podID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerID="41acdf87050919c2f2ef5d18423e597ba24a375373de59d1d6b1915e15075b69" exitCode=7
May 06 17:45:04.246081 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:04.245732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" event={"ID":"b4e12d31-7e09-464b-818c-ee0acb2ed435","Type":"ContainerDied","Data":"41acdf87050919c2f2ef5d18423e597ba24a375373de59d1d6b1915e15075b69"}
May 06 17:45:04.246081 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:04.245916 2576 scope.go:117] "RemoveContainer" containerID="41acdf87050919c2f2ef5d18423e597ba24a375373de59d1d6b1915e15075b69"
May 06 17:45:05.254736 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:05.254375 2576 generic.go:358] "Generic (PLEG): container finished" podID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerID="69c41656fbd7b7cc5f4ad8cc7380483664b9e6cca3246bc2cfa48d8b547008ac" exitCode=7
May 06 17:45:05.254736 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:05.254526 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" event={"ID":"b4e12d31-7e09-464b-818c-ee0acb2ed435","Type":"ContainerDied","Data":"69c41656fbd7b7cc5f4ad8cc7380483664b9e6cca3246bc2cfa48d8b547008ac"}
May 06 17:45:05.254736 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:05.254572 2576 scope.go:117] "RemoveContainer" containerID="41acdf87050919c2f2ef5d18423e597ba24a375373de59d1d6b1915e15075b69"
May 06 17:45:05.256933 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:05.255702 2576 scope.go:117] "RemoveContainer" containerID="69c41656fbd7b7cc5f4ad8cc7380483664b9e6cca3246bc2cfa48d8b547008ac"
May 06 17:45:05.256933 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:45:05.256009 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29634825-97rwn_opendatahub(b4e12d31-7e09-464b-818c-ee0acb2ed435)\"" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435"
May 06 17:45:06.259969 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:06.259929 2576 scope.go:117] "RemoveContainer" containerID="69c41656fbd7b7cc5f4ad8cc7380483664b9e6cca3246bc2cfa48d8b547008ac"
May 06 17:45:06.260346 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:45:06.260120 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29634825-97rwn_opendatahub(b4e12d31-7e09-464b-818c-ee0acb2ed435)\"" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435"
May 06 17:45:09.012192 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:09.012082 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log"
May 06 17:45:09.030154 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:09.017893 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log"
May 06 17:45:09.030154 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:09.020990 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log"
May 06 17:45:09.030154 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:09.026700 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log"
May 06 17:45:16.946204 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:16.946170 2576 scope.go:117] "RemoveContainer" containerID="69c41656fbd7b7cc5f4ad8cc7380483664b9e6cca3246bc2cfa48d8b547008ac"
May 06 17:45:16.947295 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:16.947279 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
May 06 17:45:17.298969 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:17.298938 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" event={"ID":"b4e12d31-7e09-464b-818c-ee0acb2ed435","Type":"ContainerStarted","Data":"570382e42616a7135217c9a7e422b471459e7fad32033d05b333d51df5152edd"}
May 06 17:45:17.325387 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:17.325343 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" podStartSLOduration=14.986136051 podStartE2EDuration="17.325329444s" podCreationTimestamp="2026-05-06 17:45:00 +0000 UTC" firstStartedPulling="2026-05-06 17:45:00.598461363 +0000 UTC m=+2092.161616448" lastFinishedPulling="2026-05-06 17:45:02.937654771 +0000 UTC m=+2094.500809841" observedRunningTime="2026-05-06 17:45:17.322124005 +0000 UTC m=+2108.885279096" watchObservedRunningTime="2026-05-06 17:45:17.325329444 +0000 UTC m=+2108.888484534"
May 06 17:45:18.304095 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:18.304063 2576 generic.go:358] "Generic (PLEG): container finished" podID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerID="570382e42616a7135217c9a7e422b471459e7fad32033d05b333d51df5152edd" exitCode=7
May 06 17:45:18.304474 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:18.304134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" event={"ID":"b4e12d31-7e09-464b-818c-ee0acb2ed435","Type":"ContainerDied","Data":"570382e42616a7135217c9a7e422b471459e7fad32033d05b333d51df5152edd"}
May 06 17:45:18.304474 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:18.304177 2576 scope.go:117] "RemoveContainer" containerID="69c41656fbd7b7cc5f4ad8cc7380483664b9e6cca3246bc2cfa48d8b547008ac"
May 06 17:45:18.304474 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:18.304424 2576 scope.go:117] "RemoveContainer" containerID="570382e42616a7135217c9a7e422b471459e7fad32033d05b333d51df5152edd"
May 06 17:45:18.304669 ip-10-0-135-110 kubenswrapper[2576]: E0506 17:45:18.304643 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cleanup pod=maas-api-key-cleanup-29634825-97rwn_opendatahub(b4e12d31-7e09-464b-818c-ee0acb2ed435)\"" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435"
May 06 17:45:18.334065 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:18.334041 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634825-97rwn"]
May 06 17:45:19.441305 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:19.441282 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn"
May 06 17:45:19.510144 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:19.510112 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrj5d\" (UniqueName: \"kubernetes.io/projected/b4e12d31-7e09-464b-818c-ee0acb2ed435-kube-api-access-nrj5d\") pod \"b4e12d31-7e09-464b-818c-ee0acb2ed435\" (UID: \"b4e12d31-7e09-464b-818c-ee0acb2ed435\") "
May 06 17:45:19.512127 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:19.512103 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e12d31-7e09-464b-818c-ee0acb2ed435-kube-api-access-nrj5d" (OuterVolumeSpecName: "kube-api-access-nrj5d") pod "b4e12d31-7e09-464b-818c-ee0acb2ed435" (UID: "b4e12d31-7e09-464b-818c-ee0acb2ed435"). InnerVolumeSpecName "kube-api-access-nrj5d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
May 06 17:45:19.611457 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:19.611394 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nrj5d\" (UniqueName: \"kubernetes.io/projected/b4e12d31-7e09-464b-818c-ee0acb2ed435-kube-api-access-nrj5d\") on node \"ip-10-0-135-110.ec2.internal\" DevicePath \"\""
May 06 17:45:20.313621 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:20.313579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn" event={"ID":"b4e12d31-7e09-464b-818c-ee0acb2ed435","Type":"ContainerDied","Data":"eba8325d309458c6d6f377cf89dcfe83773779d3385954551b8f00d2a58084d7"}
May 06 17:45:20.313803 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:20.313631 2576 scope.go:117] "RemoveContainer" containerID="570382e42616a7135217c9a7e422b471459e7fad32033d05b333d51df5152edd"
May 06 17:45:20.313803 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:20.313652 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29634825-97rwn"
May 06 17:45:20.333941 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:20.333916 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634825-97rwn"]
May 06 17:45:20.337819 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:20.337801 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29634825-97rwn"]
May 06 17:45:20.950618 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:45:20.950582 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435" path="/var/lib/kubelet/pods/b4e12d31-7e09-464b-818c-ee0acb2ed435/volumes"
May 06 17:46:40.624154 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:40.624114 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-qctrv_b7c79ce9-c833-4a16-a028-c31791a3a267/manager/0.log"
May 06 17:46:41.115782 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:41.115750 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-qs8wx_39f19d72-d186-4eed-8222-003727ac8098/manager/2.log"
May 06 17:46:41.368595 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:41.368530 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-698574c4f-psxkf_f2231f2b-56f6-42f4-a0b6-796f31e660cc/manager/0.log"
May 06 17:46:43.139744 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:43.139718 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-jshlz_14493cea-fdae-4d3b-a1f4-034511a9b419/manager/0.log"
May 06 17:46:43.613724 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:43.613693 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-tsq7b_25d6401d-4a4b-4c00-8512-8f2034dc8c43/limitador/0.log"
May 06 17:46:43.742123 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:43.742071 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-x8gjb_75941811-3994-4243-87d5-9bff9cb3bc8e/manager/0.log"
May 06 17:46:44.219682 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:44.219653 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-798958bb55-d68fg_3f660e87-5baf-4972-b849-6bf52968f521/discovery/0.log"
May 06 17:46:44.457937 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:44.457911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-69f8cf9d8c-87zfp_fcc0e3cf-aae0-4671-be00-8d0656f50a8c/kube-auth-proxy/0.log"
May 06 17:46:44.701868 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:44.701845 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-b6fc5dcc6-wptnn_0baf8df0-cec6-4632-8692-64dcfb8359a0/router/0.log"
May 06 17:46:45.566966 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:45.566941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44_280bac76-1566-423d-9f5e-c0c8a0aded7b/storage-initializer/0.log"
May 06 17:46:45.574710 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:45.574690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-zqg44_280bac76-1566-423d-9f5e-c0c8a0aded7b/main/0.log"
May 06 17:46:57.954112 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:57.954082 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-lsdnk_04fcdaac-b196-4dea-a077-864b3ee42652/global-pull-secret-syncer/0.log"
May 06 17:46:58.045773 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:58.045749 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lq7gr_6286127c-14fd-44e0-9034-230ab16d2f46/konnectivity-agent/0.log"
May 06 17:46:58.186590 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:46:58.186566 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-110.ec2.internal_f6b582af7a45e0c31cfa2ac695736e00/haproxy/0.log"
May 06 17:47:02.330105 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:02.330075 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-jshlz_14493cea-fdae-4d3b-a1f4-034511a9b419/manager/0.log"
May 06 17:47:02.460691 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:02.460667 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-tsq7b_25d6401d-4a4b-4c00-8512-8f2034dc8c43/limitador/0.log"
May 06 17:47:02.486486 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:02.486466 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-x8gjb_75941811-3994-4243-87d5-9bff9cb3bc8e/manager/0.log"
May 06 17:47:03.847855 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:03.847827 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-5c487d988c-9kdcj_3d080dd6-8304-4853-9ab3-bd27a2fdd22a/cluster-monitoring-operator/0.log"
May 06 17:47:04.009180 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:04.009155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2xccj_5ba6b606-5fa6-4eb9-a5e9-077c683fddec/node-exporter/0.log"
May 06 17:47:04.032627 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:04.032608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2xccj_5ba6b606-5fa6-4eb9-a5e9-077c683fddec/kube-rbac-proxy/0.log"
May 06 17:47:04.055951 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:04.055932 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2xccj_5ba6b606-5fa6-4eb9-a5e9-077c683fddec/init-textfile/0.log"
May 06 17:47:05.944290 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:05.944256 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-697665887d-d4v6f_3bdb3b38-b44f-4385-b653-2e7de1f5dcbc/networking-console-plugin/0.log"
May 06 17:47:06.330071 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.330038 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"]
May 06 17:47:06.330440 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.330423 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerName="cleanup"
May 06 17:47:06.330440 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.330440 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerName="cleanup"
May 06 17:47:06.330581 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.330448 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerName="cleanup"
May 06 17:47:06.330581 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.330455 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerName="cleanup"
May 06 17:47:06.330581 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.330479 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerName="cleanup"
May 06 17:47:06.330581 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.330488 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerName="cleanup"
May 06 17:47:06.330581 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.330575 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerName="cleanup"
May 06 17:47:06.330888 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.330589 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4e12d31-7e09-464b-818c-ee0acb2ed435" containerName="cleanup"
May 06 17:47:06.333612 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.333588 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.335850 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.335825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xnzz7\"/\"openshift-service-ca.crt\""
May 06 17:47:06.336640 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.336621 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xnzz7\"/\"kube-root-ca.crt\""
May 06 17:47:06.336740 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.336641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xnzz7\"/\"default-dockercfg-crpl7\""
May 06 17:47:06.341856 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.341826 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"]
May 06 17:47:06.372788 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.372767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-podres\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.372904 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.372798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmx9\" (UniqueName: \"kubernetes.io/projected/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-kube-api-access-4zmx9\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.372904 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.372821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-sys\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.372904 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.372842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-proc\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.373053 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.372951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-lib-modules\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.474095 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.474073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-lib-modules\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.474196 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.474118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-podres\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.474196 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.474145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zmx9\" (UniqueName: \"kubernetes.io/projected/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-kube-api-access-4zmx9\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.474196 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.474174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-sys\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.474373 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.474197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-proc\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.474373 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.474258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-lib-modules\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.474373 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.474274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-podres\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.474373 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.474286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-proc\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.474373 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.474311 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-sys\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.482739 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.482720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmx9\" (UniqueName: \"kubernetes.io/projected/87ccf97e-341d-48dc-bd57-a14c78ce6b3d-kube-api-access-4zmx9\") pod \"perf-node-gather-daemonset-ddvq6\" (UID: \"87ccf97e-341d-48dc-bd57-a14c78ce6b3d\") " pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.527728 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.527704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/1.log"
May 06 17:47:06.534005 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.533985 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-6vwjj_ee523985-adee-4039-8a66-2e7b0a68522a/console-operator/2.log"
May 06 17:47:06.645274 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.645206 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:06.770959 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:06.770921 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"]
May 06 17:47:06.773326 ip-10-0-135-110 kubenswrapper[2576]: W0506 17:47:06.773302 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod87ccf97e_341d_48dc_bd57_a14c78ce6b3d.slice/crio-8b01621751e7d30499d848e6cf61303529548afb2265c9c6909cc2b9eb4c6993 WatchSource:0}: Error finding container 8b01621751e7d30499d848e6cf61303529548afb2265c9c6909cc2b9eb4c6993: Status 404 returned error can't find the container with id 8b01621751e7d30499d848e6cf61303529548afb2265c9c6909cc2b9eb4c6993
May 06 17:47:07.568453 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:07.568425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-6648d555c9-w78ss_76b87b63-406b-4b3f-80ef-77d34e6a3f8f/volume-data-source-validator/0.log"
May 06 17:47:07.704885 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:07.704847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6" event={"ID":"87ccf97e-341d-48dc-bd57-a14c78ce6b3d","Type":"ContainerStarted","Data":"066d4f992a6385d7e00ef490ca5ab6be397a5b4b331cfb1f6905b848f7acd156"}
May 06 17:47:07.704885 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:07.704887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6" event={"ID":"87ccf97e-341d-48dc-bd57-a14c78ce6b3d","Type":"ContainerStarted","Data":"8b01621751e7d30499d848e6cf61303529548afb2265c9c6909cc2b9eb4c6993"}
May 06 17:47:07.705075 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:07.704954 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6"
May 06 17:47:07.723577 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:07.723535 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6" podStartSLOduration=1.723521023 podStartE2EDuration="1.723521023s" podCreationTimestamp="2026-05-06 17:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-06 17:47:07.721760578 +0000 UTC m=+2219.284915668" watchObservedRunningTime="2026-05-06 17:47:07.723521023 +0000 UTC m=+2219.286676178"
May 06 17:47:08.641558 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:08.641510 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xpknn_dca80132-2417-4fcf-b18a-e34cee059964/dns/0.log"
May 06 17:47:08.667585 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:08.667556 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xpknn_dca80132-2417-4fcf-b18a-e34cee059964/kube-rbac-proxy/0.log"
May 06 17:47:08.746123 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:08.746098 2576 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-dns_node-resolver-6jd7q_babb97ac-5bf7-447e-9b34-f306f1a7d566/dns-node-resolver/0.log" May 06 17:47:09.269597 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:09.269573 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-85dtq_9ed404eb-a555-4ae7-b728-791f9d60c831/node-ca/0.log" May 06 17:47:10.327128 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:10.327096 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-798958bb55-d68fg_3f660e87-5baf-4972-b849-6bf52968f521/discovery/0.log" May 06 17:47:10.375026 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:10.374997 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-69f8cf9d8c-87zfp_fcc0e3cf-aae0-4671-be00-8d0656f50a8c/kube-auth-proxy/0.log" May 06 17:47:10.431312 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:10.431273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-b6fc5dcc6-wptnn_0baf8df0-cec6-4632-8692-64dcfb8359a0/router/0.log" May 06 17:47:10.975868 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:10.975844 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8bzjj_33943d98-8cd7-499f-b152-50856d1a3e54/serve-healthcheck-canary/0.log" May 06 17:47:11.434766 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:11.434732 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-544c98cc96-44vjf_a2139d10-b7a2-47a8-8697-edc7d34841c7/insights-operator/0.log" May 06 17:47:11.436180 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:11.436160 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-544c98cc96-44vjf_a2139d10-b7a2-47a8-8697-edc7d34841c7/insights-operator/1.log" May 06 17:47:11.457799 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:11.457772 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-24dsh_55c3824a-49d2-4a99-aec3-1cf22bb010f4/kube-rbac-proxy/0.log" May 06 17:47:11.483515 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:11.483466 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-24dsh_55c3824a-49d2-4a99-aec3-1cf22bb010f4/exporter/0.log" May 06 17:47:11.507620 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:11.507578 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-24dsh_55c3824a-49d2-4a99-aec3-1cf22bb010f4/extractor/0.log" May 06 17:47:13.553505 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:13.553471 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-qctrv_b7c79ce9-c833-4a16-a028-c31791a3a267/manager/0.log" May 06 17:47:13.633585 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:13.633559 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-qs8wx_39f19d72-d186-4eed-8222-003727ac8098/manager/1.log" May 06 17:47:13.642740 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:13.642719 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-qs8wx_39f19d72-d186-4eed-8222-003727ac8098/manager/2.log" May 06 17:47:13.704355 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:13.704334 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-698574c4f-psxkf_f2231f2b-56f6-42f4-a0b6-796f31e660cc/manager/0.log" May 06 17:47:13.717904 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:13.717886 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xnzz7/perf-node-gather-daemonset-ddvq6" May 06 17:47:20.003694 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:20.003664 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-649b864788-8z9xm_94a34ae8-6335-4d97-84ed-a5c0f421b59a/kube-storage-version-migrator-operator/1.log" May 06 17:47:20.004483 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:20.004466 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-649b864788-8z9xm_94a34ae8-6335-4d97-84ed-a5c0f421b59a/kube-storage-version-migrator-operator/0.log" May 06 17:47:21.060909 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.060884 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7wfpq_e4f7af8a-6313-4c92-9c2a-385f8580c399/kube-multus-additional-cni-plugins/0.log" May 06 17:47:21.089938 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.089911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7wfpq_e4f7af8a-6313-4c92-9c2a-385f8580c399/egress-router-binary-copy/0.log" May 06 17:47:21.115102 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.115079 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7wfpq_e4f7af8a-6313-4c92-9c2a-385f8580c399/cni-plugins/0.log" May 06 17:47:21.137053 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.137033 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7wfpq_e4f7af8a-6313-4c92-9c2a-385f8580c399/bond-cni-plugin/0.log" May 06 17:47:21.163875 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.163852 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7wfpq_e4f7af8a-6313-4c92-9c2a-385f8580c399/routeoverride-cni/0.log" May 06 17:47:21.187970 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.187953 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7wfpq_e4f7af8a-6313-4c92-9c2a-385f8580c399/whereabouts-cni-bincopy/0.log" May 06 17:47:21.211205 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.211189 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7wfpq_e4f7af8a-6313-4c92-9c2a-385f8580c399/whereabouts-cni/0.log" May 06 17:47:21.609520 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.609498 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ghkv2_fa50d981-80fc-4dbd-83a3-f8f9cef34743/kube-multus/0.log" May 06 17:47:21.806860 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.806835 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mvgqp_2c28b880-a50d-4878-bf4e-20dc0f464cc2/network-metrics-daemon/0.log" May 06 17:47:21.864317 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:21.864292 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mvgqp_2c28b880-a50d-4878-bf4e-20dc0f464cc2/kube-rbac-proxy/0.log" May 06 17:47:23.087018 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:23.086987 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-controller/0.log" May 06 17:47:23.106818 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:23.106796 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/0.log" May 06 17:47:23.116643 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:23.116623 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/ovn-acl-logging/1.log" May 06 17:47:23.135303 ip-10-0-135-110 kubenswrapper[2576]: I0506 
17:47:23.135281 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/kube-rbac-proxy-node/0.log" May 06 17:47:23.157051 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:23.157031 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/kube-rbac-proxy-ovn-metrics/0.log" May 06 17:47:23.177194 ip-10-0-135-110 kubenswrapper[2576]: I0506 17:47:23.177174 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bbxnx_4870cbd6-d111-4dd5-b84d-b7abb6469f33/northd/0.log"