May 11 20:50:16.565262 ip-10-0-135-190 systemd[1]: Starting Kubernetes Kubelet...
May 11 20:50:17.081140 ip-10-0-135-190 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 11 20:50:17.081140 ip-10-0-135-190 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
May 11 20:50:17.081140 ip-10-0-135-190 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 11 20:50:17.081140 ip-10-0-135-190 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 11 20:50:17.081140 ip-10-0-135-190 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 11 20:50:17.083636 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.083545 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
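The five deprecation warnings above all point at the same fix: move the values off the kubelet command line and into the file named by --config (per the FLAG dump later in this log, /etc/kubernetes/kubelet.conf on this node). A minimal sketch of that migration, assuming the standard KubeletConfiguration v1beta1 schema from the linked docs; the concrete values mirror this node's FLAG dump, and the eviction threshold is an illustrative placeholder only:

  # Sketch only: deprecated flags re-expressed as KubeletConfiguration fields.
  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  containerRuntimeEndpoint: /var/run/crio/crio.sock             # was --container-runtime-endpoint
  volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # was --volume-plugin-dir
  systemReserved:                                               # was --system-reserved
    cpu: 500m
    ephemeral-storage: 1Gi
    memory: 1Gi
  evictionHard:                # the suggested replacement for --minimum-container-ttl-duration;
    memory.available: 100Mi    # this threshold is hypothetical, not taken from this node

--pod-infra-container-image has no config-file equivalent; as the server.go:212 line above notes, the sandbox image should also be set in the remote runtime (for CRI-O, the pause_image option in its configuration).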
May 11 20:50:17.085901 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085877 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 11 20:50:17.085901 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085894 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 11 20:50:17.085901 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085901 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 11 20:50:17.085901 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085905 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085910 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085914 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085918 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085922 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085926 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085930 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085933 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085937 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085941 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085945 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085949 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085953 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085956 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085960 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085964 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085968 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085972 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085975 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085979 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 11 20:50:17.086171 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085983 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085987 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085991 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085994 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.085998 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086002 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086029 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086034 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086038 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086042 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086046 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086050 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086054 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086058 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086063 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086067 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086072 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086077 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086081 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 11 20:50:17.086961 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086085 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086089 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086093 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086097 2562 feature_gate.go:328] unrecognized feature gate: Example2
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086101 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086105 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086109 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086113 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086117 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086122 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086128 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086135 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086140 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086144 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086149 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086153 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086158 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086162 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086166 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086169 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 11 20:50:17.087792 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086174 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086178 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086182 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086186 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086190 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086194 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086198 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086204 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086211 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086216 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086222 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086227 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086231 2562 feature_gate.go:328] unrecognized feature gate: Example
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086236 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086241 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086245 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086249 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086254 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086260 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 11 20:50:17.088698 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086264 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086268 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086272 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086276 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086281 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086918 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086926 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086931 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086936 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086940 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086944 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086949 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086953 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086957 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086962 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086965 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086970 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086974 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086977 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086981 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 11 20:50:17.089316 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086986 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086991 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.086995 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087000 2562 feature_gate.go:328] unrecognized feature gate: Example
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087022 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087027 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087033 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087038 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087042 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087046 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087050 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087054 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087059 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087063 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087067 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087071 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087075 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087079 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087084 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087088 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 11 20:50:17.090116 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087092 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087096 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087100 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087105 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087109 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087114 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087118 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087125 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087132 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087138 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087143 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087148 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087153 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087158 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087163 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087167 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087171 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087175 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087180 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 11 20:50:17.090635 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087185 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087189 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087193 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087198 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087202 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087206 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087210 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087215 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087219 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087223 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087227 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087231 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087236 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087240 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087244 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087249 2562 feature_gate.go:328] unrecognized feature gate: Example2
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087253 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087258 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087263 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 11 20:50:17.091157 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087267 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087271 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087275 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087280 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087285 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087290 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087294 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087299 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087303 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087307 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087311 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087315 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.087320 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
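Every feature_gate.go:328 warning above names an OpenShift-level gate (MachineConfigNodes, the NewOLM* family, ManagedBootImages*, and so on) that the upstream kubelet's feature-gate registry does not contain; the kubelet logs the unknown key and carries on, so the warnings are noisy but harmless. The only names it does recognize get different treatment: KMSv1 (deprecated upstream, feature_gate.go:349) and ServiceAccountTokenNodeBinding (already GA, feature_gate.go:351). Since --feature-gates is empty in the FLAG dump below, the gates presumably arrive through the featureGates stanza of the config file; a minimal sketch of what that stanza could look like, with illustrative values:

  featureGates:
    KMSv1: true                           # known upstream, deprecated -> feature_gate.go:349
    ServiceAccountTokenNodeBinding: true  # known upstream, GA -> feature_gate.go:351
    MachineConfigNodes: true              # OpenShift-only -> "unrecognized feature gate"

The gate list is evidently parsed several times during startup, which is why the same set of warnings repeats below with fresh timestamps.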
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087427 2562 flags.go:64] FLAG: --address="0.0.0.0"
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087439 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087449 2562 flags.go:64] FLAG: --anonymous-auth="true"
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087456 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087463 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087469 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087476 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087483 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
May 11 20:50:17.091680 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087488 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087493 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087499 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087504 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087509 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087514 2562 flags.go:64] FLAG: --cgroup-root=""
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087519 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087524 2562 flags.go:64] FLAG: --client-ca-file=""
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087529 2562 flags.go:64] FLAG: --cloud-config=""
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087534 2562 flags.go:64] FLAG: --cloud-provider="external"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087538 2562 flags.go:64] FLAG: --cluster-dns="[]"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087545 2562 flags.go:64] FLAG: --cluster-domain=""
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087550 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087555 2562 flags.go:64] FLAG: --config-dir=""
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087559 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087565 2562 flags.go:64] FLAG: --container-log-max-files="5"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087571 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087576 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087582 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087587 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087592 2562 flags.go:64] FLAG: --contention-profiling="false"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087597 2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087602 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087607 2562 flags.go:64] FLAG: --cpu-manager-policy="none"
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087612 2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
May 11 20:50:17.092226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087619 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087625 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087630 2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087635 2562 flags.go:64] FLAG: --enable-load-reader="false"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087640 2562 flags.go:64] FLAG: --enable-server="true"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087645 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087652 2562 flags.go:64] FLAG: --event-burst="100"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087657 2562 flags.go:64] FLAG: --event-qps="50"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087662 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087668 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087673 2562 flags.go:64] FLAG: --eviction-hard=""
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087679 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087685 2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087690 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087695 2562 flags.go:64] FLAG: --eviction-soft=""
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087700 2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087704 2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087709 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087713 2562 flags.go:64] FLAG: --experimental-mounter-path=""
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087718 2562 flags.go:64] FLAG: --fail-cgroupv1="false"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087723 2562 flags.go:64] FLAG: --fail-swap-on="true"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087728 2562 flags.go:64] FLAG: --feature-gates=""
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087733 2562 flags.go:64] FLAG: --file-check-frequency="20s"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087738 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087744 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
May 11 20:50:17.092825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087749 2562 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087754 2562 flags.go:64] FLAG: --healthz-port="10248"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087759 2562 flags.go:64] FLAG: --help="false"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087764 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-135-190.ec2.internal"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087770 2562 flags.go:64] FLAG: --housekeeping-interval="10s"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087775 2562 flags.go:64] FLAG: --http-check-frequency="20s"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087782 2562 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087788 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087794 2562 flags.go:64] FLAG: --image-gc-high-threshold="85"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087799 2562 flags.go:64] FLAG: --image-gc-low-threshold="80"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087804 2562 flags.go:64] FLAG: --image-service-endpoint=""
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087808 2562 flags.go:64] FLAG: --kernel-memcg-notification="false"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087813 2562 flags.go:64] FLAG: --kube-api-burst="100"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087818 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087823 2562 flags.go:64] FLAG: --kube-api-qps="50"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087828 2562 flags.go:64] FLAG: --kube-reserved=""
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087833 2562 flags.go:64] FLAG: --kube-reserved-cgroup=""
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087837 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087842 2562 flags.go:64] FLAG: --kubelet-cgroups=""
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087847 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087851 2562 flags.go:64] FLAG: --lock-file=""
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087856 2562 flags.go:64] FLAG: --log-cadvisor-usage="false"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087861 2562 flags.go:64] FLAG: --log-flush-frequency="5s"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087865 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0"
May 11 20:50:17.093449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087874 2562 flags.go:64] FLAG: --log-json-split-stream="false"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087879 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087884 2562 flags.go:64] FLAG: --log-text-split-stream="false"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087888 2562 flags.go:64] FLAG: --logging-format="text"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087893 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087902 2562 flags.go:64] FLAG: --make-iptables-util-chains="true"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087907 2562 flags.go:64] FLAG: --manifest-url=""
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087912 2562 flags.go:64] FLAG: --manifest-url-header=""
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087919 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087924 2562 flags.go:64] FLAG: --max-open-files="1000000"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087931 2562 flags.go:64] FLAG: --max-pods="110"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087936 2562 flags.go:64] FLAG: --maximum-dead-containers="-1"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087941 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087945 2562 flags.go:64] FLAG: --memory-manager-policy="None"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087952 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087957 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087962 2562 flags.go:64] FLAG: --node-ip="0.0.0.0"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087967 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087978 2562 flags.go:64] FLAG: --node-status-max-images="50"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087982 2562 flags.go:64] FLAG: --node-status-update-frequency="10s"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087987 2562 flags.go:64] FLAG: --oom-score-adj="-999"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087992 2562 flags.go:64] FLAG: --pod-cidr=""
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.087997 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3fc6c2cc09f271efd3cd2adb6c984c7cab48ea53dad824c952dee91afa8eaa20"
May 11 20:50:17.094084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088022 2562 flags.go:64] FLAG: --pod-manifest-path=""
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088027 2562 flags.go:64] FLAG: --pod-max-pids="-1"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088032 2562 flags.go:64] FLAG: --pods-per-core="0"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088036 2562 flags.go:64] FLAG: --port="10250"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088041 2562 flags.go:64] FLAG: --protect-kernel-defaults="false"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088046 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-055f0bd4df2dfe0b9"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088051 2562 flags.go:64] FLAG: --qos-reserved=""
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088056 2562 flags.go:64] FLAG: --read-only-port="10255"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088060 2562 flags.go:64] FLAG: --register-node="true"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088065 2562 flags.go:64] FLAG: --register-schedulable="true"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088070 2562 flags.go:64] FLAG: --register-with-taints=""
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088076 2562 flags.go:64] FLAG: --registry-burst="10"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088080 2562 flags.go:64] FLAG: --registry-qps="5"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088085 2562 flags.go:64] FLAG: --reserved-cpus=""
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088089 2562 flags.go:64] FLAG: --reserved-memory=""
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088097 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088102 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088107 2562 flags.go:64] FLAG: --rotate-certificates="false"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088112 2562 flags.go:64] FLAG: --rotate-server-certificates="false"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088117 2562 flags.go:64] FLAG: --runonce="false"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088121 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088126 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088132 2562 flags.go:64] FLAG: --seccomp-default="false"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088141 2562 flags.go:64] FLAG: --serialize-image-pulls="true"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088146 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088152 2562 flags.go:64] FLAG: --storage-driver-db="cadvisor"
May 11 20:50:17.094683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088156 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088161 2562 flags.go:64] FLAG: --storage-driver-password="root"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088167 2562 flags.go:64] FLAG: --storage-driver-secure="false"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088172 2562 flags.go:64] FLAG: --storage-driver-table="stats"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088177 2562 flags.go:64] FLAG: --storage-driver-user="root"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088181 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088186 2562 flags.go:64] FLAG: --sync-frequency="1m0s"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088191 2562 flags.go:64] FLAG: --system-cgroups=""
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088195 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088204 2562 flags.go:64] FLAG: --system-reserved-cgroup=""
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088209 2562 flags.go:64] FLAG: --tls-cert-file=""
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088214 2562 flags.go:64] FLAG: --tls-cipher-suites="[]"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088221 2562 flags.go:64] FLAG: --tls-min-version=""
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088225 2562 flags.go:64] FLAG: --tls-private-key-file=""
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088230 2562 flags.go:64] FLAG: --topology-manager-policy="none"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088235 2562 flags.go:64] FLAG: --topology-manager-policy-options=""
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088239 2562 flags.go:64] FLAG: --topology-manager-scope="container"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088244 2562 flags.go:64] FLAG: --v="2"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088251 2562 flags.go:64] FLAG: --version="false"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088257 2562 flags.go:64] FLAG: --vmodule=""
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088264 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
May 11 20:50:17.095342 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088271 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
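The flags.go:64 block above is the kubelet echoing the effective value of every command-line flag at startup (this node runs with --v="2"). These are flag-level values only: entries such as --cgroup-driver="cgroupfs", --authorization-mode="AlwaysAllow", and --rotate-certificates="false" are built-in defaults that the file loaded via --config can override, so the dump should not be read as the running configuration; the merged result can be inspected with kubectl get --raw "/api/v1/nodes/ip-10-0-135-190.ec2.internal/proxy/configz". A hypothetical excerpt of /etc/kubernetes/kubelet.conf overriding those defaults:

  # Hypothetical kubelet.conf excerpt: config-file fields supersede the
  # flag defaults echoed in the FLAG dump above.
  cgroupDriver: systemd          # FLAG dump shows the flag default "cgroupfs"
  authorization:
    mode: Webhook                # FLAG dump shows the flag default "AlwaysAllow"
  rotateCertificates: true       # FLAG dump shows the flag default "false"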
May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088437 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088442 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088445 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088452 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088460 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088465 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088469 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088474 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088478 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088483 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088488 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088493 2562 feature_gate.go:328] unrecognized feature gate: NewOLM May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088498 2562 feature_gate.go:328] unrecognized feature gate: DualReplica May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088502 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088506 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088511 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088518 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088524 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088529 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 11 20:50:17.095912 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088533 2562 feature_gate.go:328] unrecognized feature gate: Example2
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088537 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088541 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088545 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088549 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088554 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088558 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088562 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088566 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088572 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088576 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088580 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088585 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088589 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088593 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088597 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088601 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088608 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088612 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088616 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 11 20:50:17.096444 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088620 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088624 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088628 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088633 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088638 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088642 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088647 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088651 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088655 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088659 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088663 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088668 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088672 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088676 2562 feature_gate.go:328] unrecognized feature gate: Example
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088683 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088688 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088692 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088696 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088700 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 11 20:50:17.096940 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088704 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088708 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088712 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088716 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088720 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088724 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088729 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088733 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088736 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088741 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088747 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088751 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088756 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088761 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088765 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088769 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088773 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088777 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088782 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088786 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 11 20:50:17.097426 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088790 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088794 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088798 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088803 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.088807 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.088823 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.096419 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.10"
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.096435 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096482 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096487 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096490 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096493 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096496 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096499 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096502 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096505 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 11 20:50:17.097953 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096507 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096510 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096512 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096515 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096517 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096520 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096522 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096526 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096528 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096531 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096534 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096537 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096540 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096542 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096545 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096547 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096550 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096552 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096555 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 11 20:50:17.098376 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096557 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096560 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096562 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096565 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096569 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096572 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096575 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096578 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096580 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096583 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096586 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096588 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096590 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096593 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096595 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096598 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096600 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096603 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096606 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096608 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 11 20:50:17.098840 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096611 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096614 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096616 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096619 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096621 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096624 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096627 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096629 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096632 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096634 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096637 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096639 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096642 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096645 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096647 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096650 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096654 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096657 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096660 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 11 20:50:17.099340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096663 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096667 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096670 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096673 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096676 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096678 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096680 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096683 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096686 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096690 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096694 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096697 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096700 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096703 2562 feature_gate.go:328] unrecognized feature gate: Example2
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096706 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096708 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096711 2562 feature_gate.go:328] unrecognized feature gate: Example
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096714 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096717 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 11 20:50:17.099811 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096719 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.096725 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096824 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096828 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096832 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096837 2562 feature_gate.go:328] unrecognized feature gate: Example
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096840 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096842 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096845 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096849 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096852 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096855 2562 feature_gate.go:328] unrecognized feature gate: Example2
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096858 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096860 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096863 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
May 11 20:50:17.100355 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096866 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096868 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096871 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096874 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096876 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096879 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096881 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096884 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096887 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096889 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096892 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096894 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096897 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096899 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096902 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096904 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096907 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096909 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096912 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
May 11 20:50:17.100734 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096914 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096918 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096921 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096924 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096927 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096930 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096932 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096935 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096938 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096941 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096944 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096947 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096950 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096952 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096955 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096957 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096960 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096963 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096965 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096967 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
May 11 20:50:17.101241 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096970 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096973 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096975 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096978 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096980 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096982 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096985 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096987 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096990 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096992 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096995 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.096998 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097000 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097003 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097020 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097023 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097026 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097029 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097032 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097034 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
May 11 20:50:17.101732 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097037 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097039 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097042 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097045 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097048 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097051 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097053 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097055 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097058 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097061 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097063 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097066 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097068 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:17.097071 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.097075 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
May 11 20:50:17.102228 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.097831 2562 server.go:962] "Client rotation is on, will bootstrap in background"
May 11 20:50:17.102593 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.100594 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
May 11 20:50:17.102593 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.101854 2562 server.go:1019] "Starting client certificate rotation"
May 11 20:50:17.102593 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.101950 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
May 11 20:50:17.103050 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.103037 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
May 11 20:50:17.132716 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.132699 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
May 11 20:50:17.135294 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.135281 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
May 11 20:50:17.153442 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.153422 2562 log.go:25] "Validated CRI v1 runtime API"
May 11 20:50:17.160362 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.160344 2562 log.go:25] "Validated CRI v1 image API"
May 11 20:50:17.162270 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.162252 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 11 20:50:17.164701 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.164684 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
May 11 20:50:17.170453 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.170435 2562 fs.go:135] Filesystem UUIDs: map[39cc1051-24f8-4e3b-9821-f82f3b39e0d7:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8b56cda5-e45e-4158-a7da-3766d280d022:/dev/nvme0n1p3]
May 11 20:50:17.170517 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.170454 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
May 11 20:50:17.176562 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.176458 2562 manager.go:217] Machine: {Timestamp:2026-05-11 20:50:17.174102177 +0000 UTC m=+0.472246879 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3106150 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2df79d904e54beea18073447c87860 SystemUUID:ec2df79d-904e-54be-ea18-073447c87860 BootID:fddcb74f-a563-4cb3-8b0b-bc0784a79298 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d8:1d:6b:a3:c5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d8:1d:6b:a3:c5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:36:61:c6:c1:69:9d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
May 11 20:50:17.176562 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.176557 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
May 11 20:50:17.176668 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.176631 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.112.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260504-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
May 11 20:50:17.177980 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.177960 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 11 20:50:17.178132 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.177983 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-190.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 11 20:50:17.178174 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.178141 2562 topology_manager.go:138] "Creating topology manager with none policy"
May 11 20:50:17.178174 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.178150 2562 container_manager_linux.go:306] "Creating device plugin manager"
May 11 20:50:17.178174 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.178162 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
May 11 20:50:17.179848 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.179838 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
May 11 20:50:17.181467 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.181457 2562 state_mem.go:36] "Initialized new in-memory state store"
May 11 20:50:17.181586 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.181577 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
May 11 20:50:17.184605 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.184596 2562 kubelet.go:491] "Attempting to sync node with API server"
May 11 20:50:17.184647 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.184609 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
May 11 20:50:17.184647 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.184624 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
May 11 20:50:17.184647 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.184633 2562 kubelet.go:397] "Adding apiserver pod source"
May 11 20:50:17.184647 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.184646 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 11 20:50:17.185944 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.185931 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 11 20:50:17.185991 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.185950 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
May 11 20:50:17.189618 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.189600 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.11-2.rhaos4.20.gitb2a8320.el9" apiVersion="v1"
May 11 20:50:17.191058 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.191041 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
May 11 20:50:17.193303 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193288 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
May 11 20:50:17.193376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193308 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
May 11 20:50:17.193376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193316 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
May 11 20:50:17.193376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193324 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
May 11 20:50:17.193376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193332 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
May 11 20:50:17.193376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193342 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
May 11 20:50:17.193376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193351 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
May 11 20:50:17.193376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193360 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
May 11 20:50:17.193376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193371 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
May 11 20:50:17.193376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193380 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
May 11 20:50:17.193653 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193392 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
May 11 20:50:17.193653 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.193406 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
May 11 20:50:17.195662 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.195651 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
May 11 20:50:17.195723 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.195664 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
May 11 20:50:17.198991 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.198951 2562 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-190.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
May 11 20:50:17.199209 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.199196 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 11 20:50:17.199267 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.199244 2562 server.go:1295] "Started kubelet"
May 11 20:50:17.199336 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.199314 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
May 11 20:50:17.199724 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.199682 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 11 20:50:17.199790 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.199744 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
May 11 20:50:17.200001 ip-10-0-135-190 systemd[1]: Started Kubernetes Kubelet.
May 11 20:50:17.200653 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.199765 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-190.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
May 11 20:50:17.200834 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.200732 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
May 11 20:50:17.201002 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.200985 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 11 20:50:17.201525 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.201497 2562 server.go:317] "Adding debug handlers to kubelet server"
May 11 20:50:17.204908 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.204890 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-slbrh"
May 11 20:50:17.208474 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.207376 2562 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-190.ec2.internal.18ae9dfe059acb9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-190.ec2.internal,UID:ip-10-0-135-190.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-190.ec2.internal,},FirstTimestamp:2026-05-11 20:50:17.19920937 +0000 UTC m=+0.497354073,LastTimestamp:2026-05-11 20:50:17.19920937 +0000 UTC m=+0.497354073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-190.ec2.internal,}"
May 11 20:50:17.210718 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.210694 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
May 11 20:50:17.210959 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.210942 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 11 20:50:17.211076 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.210958 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
May 11 20:50:17.211623 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211604 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
May 11 20:50:17.211623 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211620 2562 factory.go:55] Registering systemd factory
May 11 20:50:17.211766 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211635 2562 factory.go:223] Registration of the systemd container factory successfully
May 11 20:50:17.211766 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211697 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
May 11 20:50:17.211766 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211699 2562 volume_manager.go:295] "The desired_state_of_world populator starts"
May 11 20:50:17.211766 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211737 2562 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 11 20:50:17.211939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211823 2562 factory.go:153] Registering CRI-O factory
May 11 20:50:17.211939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211837 2562 factory.go:223] Registration of the crio container factory successfully
May 11 20:50:17.211939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211827 2562 reconstruct.go:97] "Volume reconstruction finished"
May 11 20:50:17.211939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211856 2562 reconciler.go:26] "Reconciler: start to sync state"
May 11 20:50:17.211939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211861 2562 factory.go:103] Registering Raw factory
May 11 20:50:17.211939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.211877 2562 manager.go:1196] Started watching for new ooms in manager
May 11 20:50:17.211939 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.211925 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found"
May 11 20:50:17.212604 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.212588 2562 manager.go:319] Starting recovery of all containers
May 11 20:50:17.212943 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.212927 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-slbrh"
May 11 20:50:17.222091 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.222071 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:17.223278 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.223261 2562 manager.go:324] Recovery completed
May 11 20:50:17.224801 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.224779 2562 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-190.ec2.internal\" not found" node="ip-10-0-135-190.ec2.internal"
May 11 20:50:17.227599 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.227587 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 11 20:50:17.229732 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.229716 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientMemory"
May 11 20:50:17.229793 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.229741 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasNoDiskPressure"
May 11 20:50:17.229793 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.229751 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientPID"
May 11 20:50:17.230248 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.230234 2562 cpu_manager.go:222] "Starting CPU manager" policy="none"
May 11 20:50:17.230294 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.230250 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
May 11 20:50:17.230294 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.230269 2562 state_mem.go:36] "Initialized new in-memory state store"
May 11 20:50:17.233422 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.233409 2562 policy_none.go:49] "None policy: Start"
May 11 20:50:17.233463 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.233426 2562 memory_manager.go:186] "Starting memorymanager" policy="None"
May 11 20:50:17.233463 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.233436 2562 state_mem.go:35] "Initializing new in-memory state store"
May 11 20:50:17.269625 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.269606 2562 manager.go:341] "Starting Device Plugin manager"
May 11 20:50:17.291493 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.269641 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
May 11 20:50:17.291493 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.269651 2562 server.go:85] "Starting device plugin registration server"
May 11 20:50:17.291493 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.269886 2562 eviction_manager.go:189] "Eviction manager: starting control loop"
May 11 20:50:17.291493 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.269898 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 11 20:50:17.291493 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.270051 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
May 11 20:50:17.291493 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.270126 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
May 11 20:50:17.291493 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.270136 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 11 20:50:17.291493 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.270591 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
May 11 20:50:17.291493 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.270634 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-190.ec2.internal\" not found"
May 11 20:50:17.308051 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.308022 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
May 11 20:50:17.309106 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.309086 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
May 11 20:50:17.309106 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.309107 2562 status_manager.go:230] "Starting to sync pod status with apiserver"
May 11 20:50:17.309230 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.309121 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 11 20:50:17.309230 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.309127 2562 kubelet.go:2451] "Starting kubelet main sync loop"
May 11 20:50:17.309230 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.309156 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
May 11 20:50:17.311241 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.311223 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
May 11 20:50:17.370181 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.370120 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
May 11 20:50:17.371154 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.371128 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientMemory"
May 11 20:50:17.371232 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.371162 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasNoDiskPressure"
May 11 20:50:17.371232 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.371177 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientPID"
May 11 20:50:17.371232 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.371206 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-190.ec2.internal"
May 11 20:50:17.380812 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.380795 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-190.ec2.internal"
May 11 20:50:17.380903 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.380819 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-190.ec2.internal\": node \"ip-10-0-135-190.ec2.internal\" not found"
May 11 20:50:17.397677 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.397660 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found"
May 11 20:50:17.409949 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.409930 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal"]
May 11 20:50:17.410001 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.409987 2562
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" May 11 20:50:17.411306 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.411294 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientMemory" May 11 20:50:17.411373 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.411316 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasNoDiskPressure" May 11 20:50:17.411373 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.411326 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientPID" May 11 20:50:17.412491 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.412476 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8cacb3f518c3d6c299fcc831d5c18600-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal\" (UID: \"8cacb3f518c3d6c299fcc831d5c18600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" May 11 20:50:17.412542 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.412499 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cacb3f518c3d6c299fcc831d5c18600-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal\" (UID: \"8cacb3f518c3d6c299fcc831d5c18600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" May 11 20:50:17.413640 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.413629 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" May 11 20:50:17.413813 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.413797 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" May 11 20:50:17.413813 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.413833 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" May 11 20:50:17.414326 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.414310 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientMemory" May 11 20:50:17.414407 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.414340 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasNoDiskPressure" May 11 20:50:17.414407 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.414354 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientPID" May 11 20:50:17.414407 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.414311 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientMemory" May 11 20:50:17.414556 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.414417 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasNoDiskPressure" May 11 20:50:17.414556 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.414426 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientPID" May 11 20:50:17.416482 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.416467 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal" May 11 20:50:17.416547 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.416495 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" May 11 20:50:17.417202 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.417188 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientMemory" May 11 20:50:17.417262 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.417215 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasNoDiskPressure" May 11 20:50:17.417262 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.417230 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeHasSufficientPID" May 11 20:50:17.437565 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.437548 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-190.ec2.internal\" not found" node="ip-10-0-135-190.ec2.internal" May 11 20:50:17.441073 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.441058 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-190.ec2.internal\" not found" node="ip-10-0-135-190.ec2.internal" May 11 20:50:17.497858 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.497837 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found" May 11 20:50:17.513450 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.513432 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8cacb3f518c3d6c299fcc831d5c18600-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal\" (UID: \"8cacb3f518c3d6c299fcc831d5c18600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" May 11 20:50:17.513537 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.513454 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cacb3f518c3d6c299fcc831d5c18600-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal\" (UID: \"8cacb3f518c3d6c299fcc831d5c18600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" May 11 20:50:17.513537 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.513474 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c5f0e82f4ac1559ad6c0ea2fd6d8dd2a-config\") pod \"kube-apiserver-proxy-ip-10-0-135-190.ec2.internal\" (UID: \"c5f0e82f4ac1559ad6c0ea2fd6d8dd2a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal" May 11 20:50:17.513613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.513542 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8cacb3f518c3d6c299fcc831d5c18600-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal\" (UID: \"8cacb3f518c3d6c299fcc831d5c18600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" May 11 20:50:17.513613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.513541 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8cacb3f518c3d6c299fcc831d5c18600-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal\" (UID: \"8cacb3f518c3d6c299fcc831d5c18600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" May 11 20:50:17.598710 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.598675 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found" May 11 20:50:17.614053 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.614004 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c5f0e82f4ac1559ad6c0ea2fd6d8dd2a-config\") pod \"kube-apiserver-proxy-ip-10-0-135-190.ec2.internal\" (UID: \"c5f0e82f4ac1559ad6c0ea2fd6d8dd2a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal" May 11 20:50:17.614151 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.614070 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c5f0e82f4ac1559ad6c0ea2fd6d8dd2a-config\") pod \"kube-apiserver-proxy-ip-10-0-135-190.ec2.internal\" (UID: \"c5f0e82f4ac1559ad6c0ea2fd6d8dd2a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal" May 11 20:50:17.699484 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.699418 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found" May 11 20:50:17.740952 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.740930 2562 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" May 11 20:50:17.743666 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:17.743652 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal" May 11 20:50:17.800418 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.800394 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found" May 11 20:50:17.900924 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:17.900896 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found" May 11 20:50:18.001422 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:18.001354 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found" May 11 20:50:18.100892 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.100859 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" May 11 20:50:18.101410 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.100990 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" May 11 20:50:18.101410 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.101056 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" May 11 20:50:18.101949 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:18.101925 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found" May 11 20:50:18.202381 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:18.202354 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found" May 11 20:50:18.211318 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.211296 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" May 11 20:50:18.215294 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.215261 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-05-10 20:45:17 +0000 UTC" deadline="2028-01-14 10:19:41.982087256 +0000 UTC" May 11 20:50:18.215294 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.215292 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14701h29m23.766797997s" May 11 20:50:18.221088 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.221069 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" May 11 20:50:18.243940 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.243922 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-t4qx8" May 11 20:50:18.249277 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.249254 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-t4qx8" May 11 20:50:18.302723 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:18.302662 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-190.ec2.internal\" not found" May 11 20:50:18.350872 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.350844 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" May 11 20:50:18.368746 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:18.368720 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cacb3f518c3d6c299fcc831d5c18600.slice/crio-6040734a11c6910dfaed1834f091f050f5f4c9f0a6a63681c5b4150b2722a07b WatchSource:0}: Error finding container 6040734a11c6910dfaed1834f091f050f5f4c9f0a6a63681c5b4150b2722a07b: Status 404 returned error can't find the container with id 6040734a11c6910dfaed1834f091f050f5f4c9f0a6a63681c5b4150b2722a07b May 11 20:50:18.369033 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:18.368998 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5f0e82f4ac1559ad6c0ea2fd6d8dd2a.slice/crio-a305cdd44f920f21c3fe6699e05dbdb4b63803be7469f3db1af9b7a2851db673 WatchSource:0}: Error finding container a305cdd44f920f21c3fe6699e05dbdb4b63803be7469f3db1af9b7a2851db673: Status 404 returned error can't find the container with id a305cdd44f920f21c3fe6699e05dbdb4b63803be7469f3db1af9b7a2851db673 May 11 20:50:18.372063 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.372051 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 11 20:50:18.411310 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.411277 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" May 11 20:50:18.424388 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.424369 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 11 20:50:18.426625 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.426612 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal" May 11 20:50:18.434398 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.434383 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 11 20:50:18.790718 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:18.790650 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" May 11 20:50:19.185715 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.185684 2562 apiserver.go:52] "Watching apiserver" May 11 20:50:19.193568 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.193545 2562 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" May 11 20:50:19.194738 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.194709 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-zvbm7","openshift-network-diagnostics/network-check-target-xw5qw","kube-system/konnectivity-agent-grk72","kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal","openshift-multus/multus-additional-cni-plugins-m4bgl","openshift-multus/network-metrics-daemon-2ccqq","openshift-network-operator/iptables-alerter-4dgqm","openshift-ovn-kubernetes/ovnkube-node-knpxn","openshift-cluster-node-tuning-operator/tuned-rlkkm","openshift-image-registry/node-ca-p6nhn"] May 11 20:50:19.197037 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.196997 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zvbm7" May 11 20:50:19.199386 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.199257 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:19.199386 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.199341 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:19.199386 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.199382 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" May 11 20:50:19.199590 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.199377 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" May 11 20:50:19.199668 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.199652 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" May 11 20:50:19.199725 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.199654 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" May 11 20:50:19.199810 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.199792 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pvrw2\"" May 11 20:50:19.201671 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.201509 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-grk72" May 11 20:50:19.203701 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.203683 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" May 11 20:50:19.203792 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.203727 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9lrst\"" May 11 20:50:19.203857 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.203846 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.203961 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.203945 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" May 11 20:50:19.206246 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.206219 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" May 11 20:50:19.206359 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.206275 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" May 11 20:50:19.206359 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.206319 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" May 11 20:50:19.206484 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.206400 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qjbs4\"" May 11 20:50:19.213267 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.213248 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.215725 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.215707 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:19.215833 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.215776 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:19.216100 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.216074 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" May 11 20:50:19.216287 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.216263 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2hqlf\"" May 11 20:50:19.216388 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.216373 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" May 11 20:50:19.217967 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.217941 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.220368 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.220248 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.220697 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.220669 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" May 11 20:50:19.221573 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.221359 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" May 11 20:50:19.221573 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.221487 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.221954 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kwdtn\"" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.222936 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.222945 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223174 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223203 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2dzzq\"" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223384 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223441 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223452 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-etc-selinux\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223512 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-cnibin\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223537 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223538 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-var-lib-cni-multus\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223611 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbqfr\" (UniqueName: \"kubernetes.io/projected/bd93503a-1025-486b-be75-d191d6bd581d-kube-api-access-tbqfr\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.223693 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223679 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-system-cni-dir\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223744 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96cb7513-d136-4d23-90a5-47ea1604bb7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223776 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96cb7513-d136-4d23-90a5-47ea1604bb7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223833 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/96cb7513-d136-4d23-90a5-47ea1604bb7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223892 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-run-k8s-cni-cncf-io\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.223948 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-var-lib-kubelet\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224093 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c19352cd-f3ce-49f5-99aa-571926768a56-konnectivity-ca\") pod \"konnectivity-agent-grk72\" (UID: \"c19352cd-f3ce-49f5-99aa-571926768a56\") " pod="kube-system/konnectivity-agent-grk72" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224167 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-hostroot\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224231 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-run-multus-certs\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224302 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:19.224380 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224346 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gcbt\" (UniqueName: \"kubernetes.io/projected/96cb7513-d136-4d23-90a5-47ea1604bb7b-kube-api-access-2gcbt\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224386 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-cni-dir\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224414 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-run-netns\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224444 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-cnibin\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224475 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/037f8bf8-dffb-4ab0-806a-d440b0092789-cni-binary-copy\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " 
pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224521 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c19352cd-f3ce-49f5-99aa-571926768a56-agent-certs\") pod \"konnectivity-agent-grk72\" (UID: \"c19352cd-f3ce-49f5-99aa-571926768a56\") " pod="kube-system/konnectivity-agent-grk72" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224558 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-socket-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224583 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-device-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224613 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-var-lib-cni-bin\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224641 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9m55\" (UniqueName: \"kubernetes.io/projected/037f8bf8-dffb-4ab0-806a-d440b0092789-kube-api-access-f9m55\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-sys-fs\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224697 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224723 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-conf-dir\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224752 2562 
May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224781 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-system-cni-dir\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.224819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224819 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-daemon-config\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.225531 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224849 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-etc-kubernetes\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.225531 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224879 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-os-release\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.225531 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.224971 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-socket-dir-parent\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.225531 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.225017 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9"
May 11 20:50:19.225531 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.225286 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p6nhn"
May 11 20:50:19.225781 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.225694 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rlkkm"
May 11 20:50:19.225781 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.225733 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-registration-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9"
May 11 20:50:19.228249 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.228194 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wr8jp\""
May 11 20:50:19.228249 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.228216 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qwj2j\""
May 11 20:50:19.228384 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.228220 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
May 11 20:50:19.228697 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.228453 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
May 11 20:50:19.228697 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.228535 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
May 11 20:50:19.228697 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.228631 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
May 11 20:50:19.230398 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.230070 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
May 11 20:50:19.249939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.249909 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-10 20:45:18 +0000 UTC" deadline="2028-01-30 06:07:40.56384149 +0000 UTC"
May 11 20:50:19.249939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.249938 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15081h17m21.313907153s"
May 11 20:50:19.312592 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.312540 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
May 11 20:50:19.313803 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.313766 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" event={"ID":"8cacb3f518c3d6c299fcc831d5c18600","Type":"ContainerStarted","Data":"6040734a11c6910dfaed1834f091f050f5f4c9f0a6a63681c5b4150b2722a07b"}
May 11 20:50:19.314879 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.314856 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal" event={"ID":"c5f0e82f4ac1559ad6c0ea2fd6d8dd2a","Type":"ContainerStarted","Data":"a305cdd44f920f21c3fe6699e05dbdb4b63803be7469f3db1af9b7a2851db673"}
May 11 20:50:19.326191 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326169 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-host\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm"
May 11 20:50:19.326300 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326204 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a64fefbb-edf9-4ffa-adf6-0602e2c7e71b-serviceca\") pod \"node-ca-p6nhn\" (UID: \"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b\") " pod="openshift-image-registry/node-ca-p6nhn"
May 11 20:50:19.326300 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326237 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c19352cd-f3ce-49f5-99aa-571926768a56-konnectivity-ca\") pod \"konnectivity-agent-grk72\" (UID: \"c19352cd-f3ce-49f5-99aa-571926768a56\") " pod="kube-system/konnectivity-agent-grk72"
May 11 20:50:19.326300 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326261 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-hostroot\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.326300 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326286 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-cni-bin\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn"
May 11 20:50:19.326498 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326311 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c60e645-398a-4781-9df4-1e5322dfe01e-env-overrides\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn"
May 11 20:50:19.326498 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326334 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74173c0d-c7f0-494f-86f5-6992b03a83a1-tmp\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm"
May 11 20:50:19.326498 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326342 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-hostroot\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.326498 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326356 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a64fefbb-edf9-4ffa-adf6-0602e2c7e71b-host\") pod \"node-ca-p6nhn\" (UID: \"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b\") " pod="openshift-image-registry/node-ca-p6nhn"
May 11 20:50:19.326498 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326383 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw"
May 11 20:50:19.326498 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326436 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gcbt\" (UniqueName: \"kubernetes.io/projected/96cb7513-d136-4d23-90a5-47ea1604bb7b-kube-api-access-2gcbt\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl"
May 11 20:50:19.326498 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326463 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-run-netns\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.326498 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326490 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjhgm\" (UniqueName: \"kubernetes.io/projected/3be5f296-2151-4f3e-b028-c72728d855da-kube-api-access-mjhgm\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326516 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-systemd-units\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326543 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-run-netns\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326571 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-cni-netd\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326574 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-run-netns\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326596 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/037f8bf8-dffb-4ab0-806a-d440b0092789-cni-binary-copy\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326637 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c19352cd-f3ce-49f5-99aa-571926768a56-agent-certs\") pod \"konnectivity-agent-grk72\" (UID: \"c19352cd-f3ce-49f5-99aa-571926768a56\") " pod="kube-system/konnectivity-agent-grk72"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326667 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-device-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326704 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-var-lib-cni-bin\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326731 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-run-openvswitch\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326768 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326772 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-device-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326762 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-var-lib-cni-bin\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326791 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c19352cd-f3ce-49f5-99aa-571926768a56-konnectivity-ca\") pod \"konnectivity-agent-grk72\" (UID: \"c19352cd-f3ce-49f5-99aa-571926768a56\") " pod="kube-system/konnectivity-agent-grk72"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326794 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-conf-dir\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326836 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-tuned\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm"
May 11 20:50:19.326855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326833 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-conf-dir\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326870 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-os-release\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326898 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-etc-kubernetes\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326926 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c60e645-398a-4781-9df4-1e5322dfe01e-ovnkube-script-lib\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326931 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326953 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-socket-dir-parent\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326956 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-etc-kubernetes\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326953 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-os-release\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.326980 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-etc-selinux\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327019 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327056 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c60e645-398a-4781-9df4-1e5322dfe01e-ovnkube-config\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327063 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-socket-dir-parent\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327079 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbqfr\" (UniqueName: \"kubernetes.io/projected/bd93503a-1025-486b-be75-d191d6bd581d-kube-api-access-tbqfr\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327095 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-run\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327116 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-run-k8s-cni-cncf-io\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7"
May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327142 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName:
\"kubernetes.io/configmap/037f8bf8-dffb-4ab0-806a-d440b0092789-cni-binary-copy\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327157 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-etc-selinux\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.327565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327170 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-run-k8s-cni-cncf-io\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327233 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f4142e8-fca7-4d5f-aecf-381a5629861f-host-slash\") pod \"iptables-alerter-4dgqm\" (UID: \"4f4142e8-fca7-4d5f-aecf-381a5629861f\") " pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327260 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-kubelet\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327286 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-run-ovn\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327315 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-run-multus-certs\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327350 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-node-log\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327392 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-kubernetes\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 
20:50:19.327403 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327419 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-run-multus-certs\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327492 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-sysctl-d\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327516 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-lib-modules\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327554 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9b6x\" (UniqueName: \"kubernetes.io/projected/a64fefbb-edf9-4ffa-adf6-0602e2c7e71b-kube-api-access-j9b6x\") pod \"node-ca-p6nhn\" (UID: \"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b\") " pod="openshift-image-registry/node-ca-p6nhn" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327568 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-log-socket\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327591 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-cni-dir\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327617 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f4142e8-fca7-4d5f-aecf-381a5629861f-iptables-alerter-script\") pod \"iptables-alerter-4dgqm\" (UID: \"4f4142e8-fca7-4d5f-aecf-381a5629861f\") " pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.327641 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-slash\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.328250 
ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328032 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c60e645-398a-4781-9df4-1e5322dfe01e-ovn-node-metrics-cert\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.328250 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328073 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-sysctl-conf\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328090 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-cni-dir\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328119 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-systemd\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328181 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-cnibin\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328233 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-socket-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328279 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-cnibin\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328287 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9m55\" (UniqueName: \"kubernetes.io/projected/037f8bf8-dffb-4ab0-806a-d440b0092789-kube-api-access-f9m55\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328321 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-var-lib-kubelet\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328371 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-sys-fs\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328397 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-socket-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328534 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-var-lib-openvswitch\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328571 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328612 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-system-cni-dir\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328642 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-daemon-config\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328674 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz2gs\" (UniqueName: \"kubernetes.io/projected/4f4142e8-fca7-4d5f-aecf-381a5629861f-kube-api-access-vz2gs\") pod \"iptables-alerter-4dgqm\" (UID: \"4f4142e8-fca7-4d5f-aecf-381a5629861f\") " pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328706 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-run-systemd\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328739 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-modprobe-d\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.328953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328766 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-sysconfig\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328800 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-os-release\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328836 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328869 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-registration-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328902 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-cnibin\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328931 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-var-lib-cni-multus\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.328974 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9mvs\" (UniqueName: \"kubernetes.io/projected/74173c0d-c7f0-494f-86f5-6992b03a83a1-kube-api-access-z9mvs\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329026 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-system-cni-dir\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: 
\"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329063 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96cb7513-d136-4d23-90a5-47ea1604bb7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329095 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96cb7513-d136-4d23-90a5-47ea1604bb7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329115 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-system-cni-dir\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329127 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-run-ovn-kubernetes\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329135 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-os-release\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329157 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/96cb7513-d136-4d23-90a5-47ea1604bb7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329194 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-sys-fs\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329212 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-var-lib-kubelet\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329240 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.329655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329256 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-etc-openvswitch\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.330397 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329306 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-var-lib-kubelet\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.330397 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329313 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd93503a-1025-486b-be75-d191d6bd581d-registration-dir\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.330397 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329316 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/037f8bf8-dffb-4ab0-806a-d440b0092789-host-var-lib-cni-multus\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.330397 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329360 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-cnibin\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.330397 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.329855 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96cb7513-d136-4d23-90a5-47ea1604bb7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.330397 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.330274 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/96cb7513-d136-4d23-90a5-47ea1604bb7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.330397 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.330325 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5sxv\" (UniqueName: \"kubernetes.io/projected/9c60e645-398a-4781-9df4-1e5322dfe01e-kube-api-access-x5sxv\") pod \"ovnkube-node-knpxn\" (UID: 
\"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.330397 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.330358 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-sys\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.331723 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.330459 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96cb7513-d136-4d23-90a5-47ea1604bb7b-system-cni-dir\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.331723 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.330852 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96cb7513-d136-4d23-90a5-47ea1604bb7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.331723 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.330961 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/037f8bf8-dffb-4ab0-806a-d440b0092789-multus-daemon-config\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.331723 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.331270 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c19352cd-f3ce-49f5-99aa-571926768a56-agent-certs\") pod \"konnectivity-agent-grk72\" (UID: \"c19352cd-f3ce-49f5-99aa-571926768a56\") " pod="kube-system/konnectivity-agent-grk72" May 11 20:50:19.337576 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.337555 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:19.337721 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.337580 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:19.337721 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.337595 2562 projected.go:194] Error preparing data for projected volume kube-api-access-6tvs7 for pod openshift-network-diagnostics/network-check-target-xw5qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:19.337721 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.337670 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7 podName:bae2e16d-3454-4522-88aa-1afafb2e9cb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:19.837645701 +0000 UTC m=+3.135790399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6tvs7" (UniqueName: "kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7") pod "network-check-target-xw5qw" (UID: "bae2e16d-3454-4522-88aa-1afafb2e9cb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:19.339946 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.339922 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gcbt\" (UniqueName: \"kubernetes.io/projected/96cb7513-d136-4d23-90a5-47ea1604bb7b-kube-api-access-2gcbt\") pod \"multus-additional-cni-plugins-m4bgl\" (UID: \"96cb7513-d136-4d23-90a5-47ea1604bb7b\") " pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.340140 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.340111 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9m55\" (UniqueName: \"kubernetes.io/projected/037f8bf8-dffb-4ab0-806a-d440b0092789-kube-api-access-f9m55\") pod \"multus-zvbm7\" (UID: \"037f8bf8-dffb-4ab0-806a-d440b0092789\") " pod="openshift-multus/multus-zvbm7" May 11 20:50:19.340346 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.340329 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbqfr\" (UniqueName: \"kubernetes.io/projected/bd93503a-1025-486b-be75-d191d6bd581d-kube-api-access-tbqfr\") pod \"aws-ebs-csi-driver-node-9v4h9\" (UID: \"bd93503a-1025-486b-be75-d191d6bd581d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.366423 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.366398 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" May 11 20:50:19.430663 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430633 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f4142e8-fca7-4d5f-aecf-381a5629861f-host-slash\") pod \"iptables-alerter-4dgqm\" (UID: \"4f4142e8-fca7-4d5f-aecf-381a5629861f\") " pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430674 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-kubelet\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430698 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-run-ovn\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430723 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-node-log\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430747 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-kubernetes\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430749 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4f4142e8-fca7-4d5f-aecf-381a5629861f-host-slash\") pod \"iptables-alerter-4dgqm\" (UID: \"4f4142e8-fca7-4d5f-aecf-381a5629861f\") " pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430759 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-run-ovn\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430771 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-sysctl-d\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430749 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-kubelet\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430784 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-node-log\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.430805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430808 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-lib-modules\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430827 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-kubernetes\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430841 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j9b6x\" (UniqueName: \"kubernetes.io/projected/a64fefbb-edf9-4ffa-adf6-0602e2c7e71b-kube-api-access-j9b6x\") pod \"node-ca-p6nhn\" (UID: \"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b\") " pod="openshift-image-registry/node-ca-p6nhn" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430868 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-log-socket\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430895 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f4142e8-fca7-4d5f-aecf-381a5629861f-iptables-alerter-script\") pod \"iptables-alerter-4dgqm\" (UID: \"4f4142e8-fca7-4d5f-aecf-381a5629861f\") " pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430907 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-sysctl-d\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430918 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-slash\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430931 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-lib-modules\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430943 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c60e645-398a-4781-9df4-1e5322dfe01e-ovn-node-metrics-cert\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430950 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-log-socket\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430966 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-sysctl-conf\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.430989 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-slash\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431222 ip-10-0-135-190 
kubenswrapper[2562]: I0511 20:50:19.430992 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-systemd\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431026 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-var-lib-kubelet\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431046 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-var-lib-openvswitch\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431064 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431081 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz2gs\" (UniqueName: \"kubernetes.io/projected/4f4142e8-fca7-4d5f-aecf-381a5629861f-kube-api-access-vz2gs\") pod \"iptables-alerter-4dgqm\" (UID: \"4f4142e8-fca7-4d5f-aecf-381a5629861f\") " pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.431222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431096 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-run-systemd\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431113 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-modprobe-d\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431136 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-sysconfig\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431184 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-var-lib-openvswitch\") pod 
\"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431204 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-sysconfig\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431222 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9mvs\" (UniqueName: \"kubernetes.io/projected/74173c0d-c7f0-494f-86f5-6992b03a83a1-kube-api-access-z9mvs\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431247 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431256 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-run-ovn-kubernetes\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431286 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-etc-openvswitch\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431311 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5sxv\" (UniqueName: \"kubernetes.io/projected/9c60e645-398a-4781-9df4-1e5322dfe01e-kube-api-access-x5sxv\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431339 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-sys\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431363 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-host\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431368 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-var-lib-kubelet\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431385 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a64fefbb-edf9-4ffa-adf6-0602e2c7e71b-serviceca\") pod \"node-ca-p6nhn\" (UID: \"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b\") " pod="openshift-image-registry/node-ca-p6nhn" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431414 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-cni-bin\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431438 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c60e645-398a-4781-9df4-1e5322dfe01e-env-overrides\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431456 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4f4142e8-fca7-4d5f-aecf-381a5629861f-iptables-alerter-script\") pod \"iptables-alerter-4dgqm\" (UID: \"4f4142e8-fca7-4d5f-aecf-381a5629861f\") " pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.431965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431462 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74173c0d-c7f0-494f-86f5-6992b03a83a1-tmp\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431484 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a64fefbb-edf9-4ffa-adf6-0602e2c7e71b-host\") pod \"node-ca-p6nhn\" (UID: \"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b\") " pod="openshift-image-registry/node-ca-p6nhn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431487 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-run-systemd\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431509 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-sysctl-conf\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431521 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-sys\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431528 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjhgm\" (UniqueName: \"kubernetes.io/projected/3be5f296-2151-4f3e-b028-c72728d855da-kube-api-access-mjhgm\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431558 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-systemd\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431572 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-modprobe-d\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431606 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-etc-openvswitch\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431614 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-run-ovn-kubernetes\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431661 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a64fefbb-edf9-4ffa-adf6-0602e2c7e71b-host\") pod \"node-ca-p6nhn\" (UID: \"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b\") " pod="openshift-image-registry/node-ca-p6nhn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431669 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-systemd-units\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431672 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-host\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431777 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-cni-bin\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431808 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-run-netns\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431834 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-cni-netd\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431879 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-cni-netd\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431879 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-systemd-units\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.432795 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431900 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-run-openvswitch\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431923 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-host-run-netns\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431928 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-tuned\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431955 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c60e645-398a-4781-9df4-1e5322dfe01e-ovnkube-script-lib\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.431982 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.432025 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c60e645-398a-4781-9df4-1e5322dfe01e-ovnkube-config\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.432049 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-run\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.432121 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/74173c0d-c7f0-494f-86f5-6992b03a83a1-run\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.432128 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c60e645-398a-4781-9df4-1e5322dfe01e-env-overrides\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.432241 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.432310 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs podName:3be5f296-2151-4f3e-b028-c72728d855da nodeName:}" failed. No retries permitted until 2026-05-11 20:50:19.932291746 +0000 UTC m=+3.230436453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs") pod "network-metrics-daemon-2ccqq" (UID: "3be5f296-2151-4f3e-b028-c72728d855da") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.432331 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a64fefbb-edf9-4ffa-adf6-0602e2c7e71b-serviceca\") pod \"node-ca-p6nhn\" (UID: \"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b\") " pod="openshift-image-registry/node-ca-p6nhn" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.432384 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c60e645-398a-4781-9df4-1e5322dfe01e-run-openvswitch\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.432636 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c60e645-398a-4781-9df4-1e5322dfe01e-ovnkube-script-lib\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.433405 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.432779 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c60e645-398a-4781-9df4-1e5322dfe01e-ovnkube-config\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.434242 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.434221 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c60e645-398a-4781-9df4-1e5322dfe01e-ovn-node-metrics-cert\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.434358 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.434238 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/74173c0d-c7f0-494f-86f5-6992b03a83a1-etc-tuned\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.434443 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.434428 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/74173c0d-c7f0-494f-86f5-6992b03a83a1-tmp\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.438955 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.438932 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz2gs\" (UniqueName: \"kubernetes.io/projected/4f4142e8-fca7-4d5f-aecf-381a5629861f-kube-api-access-vz2gs\") pod \"iptables-alerter-4dgqm\" (UID: \"4f4142e8-fca7-4d5f-aecf-381a5629861f\") " pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.439277 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.439257 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9b6x\" (UniqueName: \"kubernetes.io/projected/a64fefbb-edf9-4ffa-adf6-0602e2c7e71b-kube-api-access-j9b6x\") pod \"node-ca-p6nhn\" (UID: \"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b\") " pod="openshift-image-registry/node-ca-p6nhn" May 11 20:50:19.439360 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.439259 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9mvs\" (UniqueName: \"kubernetes.io/projected/74173c0d-c7f0-494f-86f5-6992b03a83a1-kube-api-access-z9mvs\") pod \"tuned-rlkkm\" (UID: \"74173c0d-c7f0-494f-86f5-6992b03a83a1\") " pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.439416 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.439402 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5sxv\" (UniqueName: \"kubernetes.io/projected/9c60e645-398a-4781-9df4-1e5322dfe01e-kube-api-access-x5sxv\") pod \"ovnkube-node-knpxn\" (UID: \"9c60e645-398a-4781-9df4-1e5322dfe01e\") " pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.440484 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.440463 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjhgm\" (UniqueName: \"kubernetes.io/projected/3be5f296-2151-4f3e-b028-c72728d855da-kube-api-access-mjhgm\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:19.517561 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.517528 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zvbm7" May 11 20:50:19.526138 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.526112 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-grk72" May 11 20:50:19.534807 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.534786 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" May 11 20:50:19.539359 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.539341 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" May 11 20:50:19.545901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.545882 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4dgqm" May 11 20:50:19.552466 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.552449 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:19.558962 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.558936 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p6nhn" May 11 20:50:19.564505 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.564460 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" May 11 20:50:19.699604 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.699570 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" May 11 20:50:19.934559 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.934478 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:19.934559 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:19.934533 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:19.934743 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.934622 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:19.934743 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.934637 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:19.934743 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.934654 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:19.934743 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.934665 2562 projected.go:194] Error preparing data for projected volume kube-api-access-6tvs7 for pod openshift-network-diagnostics/network-check-target-xw5qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:19.934743 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.934683 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs podName:3be5f296-2151-4f3e-b028-c72728d855da nodeName:}" failed. No retries permitted until 2026-05-11 20:50:20.934665794 +0000 UTC m=+4.232810490 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs") pod "network-metrics-daemon-2ccqq" (UID: "3be5f296-2151-4f3e-b028-c72728d855da") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:19.934743 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:19.934700 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7 podName:bae2e16d-3454-4522-88aa-1afafb2e9cb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:20.934690192 +0000 UTC m=+4.232834895 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6tvs7" (UniqueName: "kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7") pod "network-check-target-xw5qw" (UID: "bae2e16d-3454-4522-88aa-1afafb2e9cb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:20.034856 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:20.034829 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96cb7513_d136_4d23_90a5_47ea1604bb7b.slice/crio-1fcf786d9e3ecacbaf3abc663fb3d3a517171b62fb52df303c3a65cad0b2cdda WatchSource:0}: Error finding container 1fcf786d9e3ecacbaf3abc663fb3d3a517171b62fb52df303c3a65cad0b2cdda: Status 404 returned error can't find the container with id 1fcf786d9e3ecacbaf3abc663fb3d3a517171b62fb52df303c3a65cad0b2cdda May 11 20:50:20.035873 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:20.035849 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74173c0d_c7f0_494f_86f5_6992b03a83a1.slice/crio-2e56b72d47fbdf47dae3c82374953e492e451a017c02536e37977bde6afa5885 WatchSource:0}: Error finding container 2e56b72d47fbdf47dae3c82374953e492e451a017c02536e37977bde6afa5885: Status 404 returned error can't find the container with id 2e56b72d47fbdf47dae3c82374953e492e451a017c02536e37977bde6afa5885 May 11 20:50:20.037717 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:20.037408 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19352cd_f3ce_49f5_99aa_571926768a56.slice/crio-bd1c3e3f17761ca1ed56062d9c673eebab911e12fe4626d5d873b755aae0c5f7 WatchSource:0}: Error finding container bd1c3e3f17761ca1ed56062d9c673eebab911e12fe4626d5d873b755aae0c5f7: Status 404 returned error can't find the container with id bd1c3e3f17761ca1ed56062d9c673eebab911e12fe4626d5d873b755aae0c5f7 May 11 20:50:20.038976 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:20.038936 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod037f8bf8_dffb_4ab0_806a_d440b0092789.slice/crio-72859ef01a27c1586798d4be66696780bda4880b156185349e17e7fc64d730b6 WatchSource:0}: Error finding container 72859ef01a27c1586798d4be66696780bda4880b156185349e17e7fc64d730b6: Status 404 returned error can't find the container with id 72859ef01a27c1586798d4be66696780bda4880b156185349e17e7fc64d730b6 May 11 20:50:20.041586 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:20.041558 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c60e645_398a_4781_9df4_1e5322dfe01e.slice/crio-40a7676ec14f1afcc8a83d5233be4d25ddd7a6ea002c97fdcf98deb490dcec6a WatchSource:0}: Error finding container 40a7676ec14f1afcc8a83d5233be4d25ddd7a6ea002c97fdcf98deb490dcec6a: Status 404 returned error can't find the container with id 40a7676ec14f1afcc8a83d5233be4d25ddd7a6ea002c97fdcf98deb490dcec6a May 11 20:50:20.042867 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:20.042630 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd93503a_1025_486b_be75_d191d6bd581d.slice/crio-1c12fd6f51af4a86fa78835ed6741c7ad87f3548671b5632568c5cd6482eea1c WatchSource:0}: Error finding 
container 1c12fd6f51af4a86fa78835ed6741c7ad87f3548671b5632568c5cd6482eea1c: Status 404 returned error can't find the container with id 1c12fd6f51af4a86fa78835ed6741c7ad87f3548671b5632568c5cd6482eea1c May 11 20:50:20.043936 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:20.043696 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f4142e8_fca7_4d5f_aecf_381a5629861f.slice/crio-f4b841125c22b65a06d1a3a2b07cc93941fb8df7ed1aafc87d81bb8185ffc98c WatchSource:0}: Error finding container f4b841125c22b65a06d1a3a2b07cc93941fb8df7ed1aafc87d81bb8185ffc98c: Status 404 returned error can't find the container with id f4b841125c22b65a06d1a3a2b07cc93941fb8df7ed1aafc87d81bb8185ffc98c May 11 20:50:20.251004 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.250758 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-05-10 20:45:18 +0000 UTC" deadline="2027-10-24 01:05:41.285220638 +0000 UTC" May 11 20:50:20.251004 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.250942 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12724h15m21.034283146s" May 11 20:50:20.317336 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.317300 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p6nhn" event={"ID":"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b","Type":"ContainerStarted","Data":"fa5d68c6d223f939453bb2bce8bf1749f987ac846f57092ebb95a210f63ca764"} May 11 20:50:20.318190 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.318168 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerStarted","Data":"40a7676ec14f1afcc8a83d5233be4d25ddd7a6ea002c97fdcf98deb490dcec6a"} May 11 20:50:20.318943 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.318924 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zvbm7" event={"ID":"037f8bf8-dffb-4ab0-806a-d440b0092789","Type":"ContainerStarted","Data":"72859ef01a27c1586798d4be66696780bda4880b156185349e17e7fc64d730b6"} May 11 20:50:20.319768 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.319750 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-grk72" event={"ID":"c19352cd-f3ce-49f5-99aa-571926768a56","Type":"ContainerStarted","Data":"bd1c3e3f17761ca1ed56062d9c673eebab911e12fe4626d5d873b755aae0c5f7"} May 11 20:50:20.320660 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.320643 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" event={"ID":"74173c0d-c7f0-494f-86f5-6992b03a83a1","Type":"ContainerStarted","Data":"2e56b72d47fbdf47dae3c82374953e492e451a017c02536e37977bde6afa5885"} May 11 20:50:20.321984 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.321965 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" event={"ID":"8cacb3f518c3d6c299fcc831d5c18600","Type":"ContainerStarted","Data":"c2f465cbaeb26b6d29f6ec5de4ea7890c5c6cfe001c28f2025837d21ce189ba1"} May 11 20:50:20.323133 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.323114 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal" 
event={"ID":"c5f0e82f4ac1559ad6c0ea2fd6d8dd2a","Type":"ContainerStarted","Data":"a6140ff38f9a926416e8fff867fe6edfb526072078d4929e5637b6418d89c371"} May 11 20:50:20.323998 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.323977 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4dgqm" event={"ID":"4f4142e8-fca7-4d5f-aecf-381a5629861f","Type":"ContainerStarted","Data":"f4b841125c22b65a06d1a3a2b07cc93941fb8df7ed1aafc87d81bb8185ffc98c"} May 11 20:50:20.327983 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.327960 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" event={"ID":"bd93503a-1025-486b-be75-d191d6bd581d","Type":"ContainerStarted","Data":"1c12fd6f51af4a86fa78835ed6741c7ad87f3548671b5632568c5cd6482eea1c"} May 11 20:50:20.328979 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.328959 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" event={"ID":"96cb7513-d136-4d23-90a5-47ea1604bb7b","Type":"ContainerStarted","Data":"1fcf786d9e3ecacbaf3abc663fb3d3a517171b62fb52df303c3a65cad0b2cdda"} May 11 20:50:20.351258 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.351215 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-190.ec2.internal" podStartSLOduration=2.351203614 podStartE2EDuration="2.351203614s" podCreationTimestamp="2026-05-11 20:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:50:20.351171747 +0000 UTC m=+3.649316469" watchObservedRunningTime="2026-05-11 20:50:20.351203614 +0000 UTC m=+3.649348325" May 11 20:50:20.941477 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.941440 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:20.941672 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:20.941511 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:20.941672 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:20.941623 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:20.941777 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:20.941685 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs podName:3be5f296-2151-4f3e-b028-c72728d855da nodeName:}" failed. No retries permitted until 2026-05-11 20:50:22.941666154 +0000 UTC m=+6.239810848 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs") pod "network-metrics-daemon-2ccqq" (UID: "3be5f296-2151-4f3e-b028-c72728d855da") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:20.942158 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:20.942137 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:20.942255 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:20.942163 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:20.942255 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:20.942176 2562 projected.go:194] Error preparing data for projected volume kube-api-access-6tvs7 for pod openshift-network-diagnostics/network-check-target-xw5qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:20.942255 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:20.942222 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7 podName:bae2e16d-3454-4522-88aa-1afafb2e9cb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:22.942206746 +0000 UTC m=+6.240351441 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6tvs7" (UniqueName: "kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7") pod "network-check-target-xw5qw" (UID: "bae2e16d-3454-4522-88aa-1afafb2e9cb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:21.310867 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:21.310167 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:21.310867 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:21.310286 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:21.311746 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:21.311472 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:21.311746 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:21.311586 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:21.338938 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:21.338696 2562 generic.go:358] "Generic (PLEG): container finished" podID="8cacb3f518c3d6c299fcc831d5c18600" containerID="c2f465cbaeb26b6d29f6ec5de4ea7890c5c6cfe001c28f2025837d21ce189ba1" exitCode=0 May 11 20:50:21.338938 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:21.338770 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" event={"ID":"8cacb3f518c3d6c299fcc831d5c18600","Type":"ContainerDied","Data":"c2f465cbaeb26b6d29f6ec5de4ea7890c5c6cfe001c28f2025837d21ce189ba1"} May 11 20:50:22.347210 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.346512 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" event={"ID":"8cacb3f518c3d6c299fcc831d5c18600","Type":"ContainerStarted","Data":"3d3ae9c5a935d5c89057fe472da49d38f3171a7de2df6bf198bb1c133bd19e23"} May 11 20:50:22.674776 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.673529 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-190.ec2.internal" podStartSLOduration=4.673507764 podStartE2EDuration="4.673507764s" podCreationTimestamp="2026-05-11 20:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:50:22.360989143 +0000 UTC m=+5.659133855" watchObservedRunningTime="2026-05-11 20:50:22.673507764 +0000 UTC m=+5.971652463" May 11 20:50:22.674776 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.674056 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9qsqx"] May 11 20:50:22.676914 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.676889 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:22.677056 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:22.676974 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:22.761463 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.761416 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:22.761630 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.761483 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d85defd8-e86e-4d13-9e13-373afa866baa-kubelet-config\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:22.761630 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.761524 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d85defd8-e86e-4d13-9e13-373afa866baa-dbus\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:22.862464 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.862428 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:22.862623 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.862495 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d85defd8-e86e-4d13-9e13-373afa866baa-kubelet-config\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:22.862623 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.862528 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d85defd8-e86e-4d13-9e13-373afa866baa-dbus\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:22.862623 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:22.862591 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 11 20:50:22.862876 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:22.862672 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret podName:d85defd8-e86e-4d13-9e13-373afa866baa nodeName:}" failed. No retries permitted until 2026-05-11 20:50:23.362652629 +0000 UTC m=+6.660797339 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret") pod "global-pull-secret-syncer-9qsqx" (UID: "d85defd8-e86e-4d13-9e13-373afa866baa") : object "kube-system"/"original-pull-secret" not registered May 11 20:50:22.862876 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.862730 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d85defd8-e86e-4d13-9e13-373afa866baa-dbus\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:22.862876 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.862731 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d85defd8-e86e-4d13-9e13-373afa866baa-kubelet-config\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:22.963922 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.963348 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:22.963922 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:22.963409 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:22.963922 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:22.963538 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:22.963922 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:22.963542 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:22.963922 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:22.963566 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:22.963922 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:22.963578 2562 projected.go:194] Error preparing data for projected volume kube-api-access-6tvs7 for pod openshift-network-diagnostics/network-check-target-xw5qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:22.963922 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:22.963596 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs podName:3be5f296-2151-4f3e-b028-c72728d855da nodeName:}" failed. No retries permitted until 2026-05-11 20:50:26.963577584 +0000 UTC m=+10.261722309 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs") pod "network-metrics-daemon-2ccqq" (UID: "3be5f296-2151-4f3e-b028-c72728d855da") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:22.963922 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:22.963628 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7 podName:bae2e16d-3454-4522-88aa-1afafb2e9cb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:26.963611865 +0000 UTC m=+10.261756562 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6tvs7" (UniqueName: "kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7") pod "network-check-target-xw5qw" (UID: "bae2e16d-3454-4522-88aa-1afafb2e9cb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:23.310125 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:23.310045 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:23.310274 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:23.310182 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:23.310274 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:23.310045 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:23.310392 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:23.310352 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:23.366316 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:23.366278 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:23.366895 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:23.366437 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 11 20:50:23.366895 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:23.366513 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret podName:d85defd8-e86e-4d13-9e13-373afa866baa nodeName:}" failed. No retries permitted until 2026-05-11 20:50:24.366492511 +0000 UTC m=+7.664637210 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret") pod "global-pull-secret-syncer-9qsqx" (UID: "d85defd8-e86e-4d13-9e13-373afa866baa") : object "kube-system"/"original-pull-secret" not registered May 11 20:50:24.309386 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:24.309354 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:24.309583 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:24.309486 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:24.374032 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:24.373976 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:24.374496 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:24.374145 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 11 20:50:24.374496 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:24.374204 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret podName:d85defd8-e86e-4d13-9e13-373afa866baa nodeName:}" failed. No retries permitted until 2026-05-11 20:50:26.374186258 +0000 UTC m=+9.672330965 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret") pod "global-pull-secret-syncer-9qsqx" (UID: "d85defd8-e86e-4d13-9e13-373afa866baa") : object "kube-system"/"original-pull-secret" not registered May 11 20:50:25.299646 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.299219 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zhs5x"] May 11 20:50:25.306257 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.306232 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.308782 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.308756 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qvmvl\"" May 11 20:50:25.309087 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.309067 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" May 11 20:50:25.309252 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.309235 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" May 11 20:50:25.309553 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.309536 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:25.309669 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:25.309648 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:25.310150 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.310129 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:25.310276 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:25.310257 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:25.381982 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.381941 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d55bb28-de13-44d3-9322-9b22abc5dc03-tmp-dir\") pod \"node-resolver-zhs5x\" (UID: \"9d55bb28-de13-44d3-9322-9b22abc5dc03\") " pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.382419 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.382059 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d55bb28-de13-44d3-9322-9b22abc5dc03-hosts-file\") pod \"node-resolver-zhs5x\" (UID: \"9d55bb28-de13-44d3-9322-9b22abc5dc03\") " pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.382419 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.382085 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cccrn\" (UniqueName: \"kubernetes.io/projected/9d55bb28-de13-44d3-9322-9b22abc5dc03-kube-api-access-cccrn\") pod \"node-resolver-zhs5x\" (UID: \"9d55bb28-de13-44d3-9322-9b22abc5dc03\") " pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.483997 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.483406 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d55bb28-de13-44d3-9322-9b22abc5dc03-hosts-file\") pod \"node-resolver-zhs5x\" (UID: \"9d55bb28-de13-44d3-9322-9b22abc5dc03\") " pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.483997 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.483453 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cccrn\" (UniqueName: \"kubernetes.io/projected/9d55bb28-de13-44d3-9322-9b22abc5dc03-kube-api-access-cccrn\") pod \"node-resolver-zhs5x\" (UID: \"9d55bb28-de13-44d3-9322-9b22abc5dc03\") " pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.483997 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.483545 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d55bb28-de13-44d3-9322-9b22abc5dc03-tmp-dir\") 
pod \"node-resolver-zhs5x\" (UID: \"9d55bb28-de13-44d3-9322-9b22abc5dc03\") " pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.483997 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.483892 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9d55bb28-de13-44d3-9322-9b22abc5dc03-tmp-dir\") pod \"node-resolver-zhs5x\" (UID: \"9d55bb28-de13-44d3-9322-9b22abc5dc03\") " pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.483997 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.483967 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9d55bb28-de13-44d3-9322-9b22abc5dc03-hosts-file\") pod \"node-resolver-zhs5x\" (UID: \"9d55bb28-de13-44d3-9322-9b22abc5dc03\") " pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.494319 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.494295 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cccrn\" (UniqueName: \"kubernetes.io/projected/9d55bb28-de13-44d3-9322-9b22abc5dc03-kube-api-access-cccrn\") pod \"node-resolver-zhs5x\" (UID: \"9d55bb28-de13-44d3-9322-9b22abc5dc03\") " pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:25.618474 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:25.618394 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zhs5x" May 11 20:50:26.309331 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:26.309300 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:26.309505 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:26.309429 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:26.391347 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:26.391307 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:26.391821 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:26.391492 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 11 20:50:26.391821 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:26.391554 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret podName:d85defd8-e86e-4d13-9e13-373afa866baa nodeName:}" failed. No retries permitted until 2026-05-11 20:50:30.391535431 +0000 UTC m=+13.689680122 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret") pod "global-pull-secret-syncer-9qsqx" (UID: "d85defd8-e86e-4d13-9e13-373afa866baa") : object "kube-system"/"original-pull-secret" not registered May 11 20:50:26.996982 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:26.996938 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:26.997182 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:26.997004 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:26.997182 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:26.997109 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:26.997182 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:26.997137 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:26.997182 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:26.997148 2562 projected.go:194] Error preparing data for projected volume kube-api-access-6tvs7 for pod openshift-network-diagnostics/network-check-target-xw5qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:26.997182 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:26.997154 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:26.997429 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:26.997206 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7 podName:bae2e16d-3454-4522-88aa-1afafb2e9cb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:34.997187496 +0000 UTC m=+18.295332186 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6tvs7" (UniqueName: "kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7") pod "network-check-target-xw5qw" (UID: "bae2e16d-3454-4522-88aa-1afafb2e9cb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:26.997429 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:26.997228 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs podName:3be5f296-2151-4f3e-b028-c72728d855da nodeName:}" failed. No retries permitted until 2026-05-11 20:50:34.997216523 +0000 UTC m=+18.295361211 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs") pod "network-metrics-daemon-2ccqq" (UID: "3be5f296-2151-4f3e-b028-c72728d855da") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:27.311032 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:27.310990 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:27.311166 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:27.311112 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:27.311166 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:27.311129 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:27.311286 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:27.311241 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:28.309942 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:28.309892 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:28.310497 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:28.310035 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:29.310235 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:29.310157 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:29.310679 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:29.310283 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:29.310679 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:29.310394 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:29.310679 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:29.310531 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:30.310254 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:30.310223 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:30.310634 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:30.310331 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:30.424778 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:30.424742 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:30.424955 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:30.424877 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 11 20:50:30.424955 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:30.424939 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret podName:d85defd8-e86e-4d13-9e13-373afa866baa nodeName:}" failed. No retries permitted until 2026-05-11 20:50:38.424924227 +0000 UTC m=+21.723068938 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret") pod "global-pull-secret-syncer-9qsqx" (UID: "d85defd8-e86e-4d13-9e13-373afa866baa") : object "kube-system"/"original-pull-secret" not registered May 11 20:50:31.309474 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:31.309433 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:31.309668 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:31.309442 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:31.309668 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:31.309569 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:31.309668 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:31.309650 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:32.309742 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:32.309713 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:32.310146 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:32.309827 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:33.309525 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:33.309488 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:33.309672 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:33.309607 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:33.309672 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:33.309662 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:33.309767 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:33.309751 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:34.309506 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:34.309472 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:34.309760 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:34.309614 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:35.056603 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:35.056569 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:35.057041 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:35.056630 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:35.057041 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:35.056744 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:35.057041 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:35.056748 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:35.057041 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:35.056771 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:35.057041 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:35.056786 2562 projected.go:194] Error preparing data for projected volume kube-api-access-6tvs7 for pod openshift-network-diagnostics/network-check-target-xw5qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:35.057041 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:35.056795 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs podName:3be5f296-2151-4f3e-b028-c72728d855da nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.056780408 +0000 UTC m=+34.354925097 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs") pod "network-metrics-daemon-2ccqq" (UID: "3be5f296-2151-4f3e-b028-c72728d855da") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:35.057041 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:35.056835 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7 podName:bae2e16d-3454-4522-88aa-1afafb2e9cb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.056818414 +0000 UTC m=+34.354963121 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6tvs7" (UniqueName: "kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7") pod "network-check-target-xw5qw" (UID: "bae2e16d-3454-4522-88aa-1afafb2e9cb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:35.309707 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:35.309626 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:35.309707 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:35.309677 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:35.309903 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:35.309754 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:35.309903 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:35.309891 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:36.309360 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:36.309325 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:36.309785 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:36.309428 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:37.310878 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:37.310848 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:37.311324 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:37.310944 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:37.311324 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:37.311043 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:37.311324 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:37.311141 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:37.905589 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:37.905538 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d55bb28_de13_44d3_9322_9b22abc5dc03.slice/crio-b6586df204fb0d9e1e4946664e9d055ada5e9b8139181230d3e0199ffe7cd129 WatchSource:0}: Error finding container b6586df204fb0d9e1e4946664e9d055ada5e9b8139181230d3e0199ffe7cd129: Status 404 returned error can't find the container with id b6586df204fb0d9e1e4946664e9d055ada5e9b8139181230d3e0199ffe7cd129 May 11 20:50:38.310031 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.309853 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:38.310164 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:38.310119 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:38.371024 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.370965 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-grk72" event={"ID":"c19352cd-f3ce-49f5-99aa-571926768a56","Type":"ContainerStarted","Data":"a0529b012d2c7ae3dec22c235b1c1f57302f63f099dfb9977c2036396ac4a87d"} May 11 20:50:38.372352 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.372323 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" event={"ID":"74173c0d-c7f0-494f-86f5-6992b03a83a1","Type":"ContainerStarted","Data":"ae1084ffc65f81a5ae5a5e9159b6f658add10dc884ffcec939583f77806e1b91"} May 11 20:50:38.373725 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.373704 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" event={"ID":"bd93503a-1025-486b-be75-d191d6bd581d","Type":"ContainerStarted","Data":"08673a7fb72675268fe12f88f8d43695d11576cff26a76549d375a921b1ee8c5"} May 11 20:50:38.375120 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.375097 2562 generic.go:358] "Generic (PLEG): container finished" podID="96cb7513-d136-4d23-90a5-47ea1604bb7b" containerID="a603a415b1f0f68a2fee2b18b74862d50afa6200e6ee45c2ed7c54d39d462a93" exitCode=0 May 11 20:50:38.375214 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.375158 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" event={"ID":"96cb7513-d136-4d23-90a5-47ea1604bb7b","Type":"ContainerDied","Data":"a603a415b1f0f68a2fee2b18b74862d50afa6200e6ee45c2ed7c54d39d462a93"} May 11 20:50:38.376371 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.376352 2562 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zhs5x" event={"ID":"9d55bb28-de13-44d3-9322-9b22abc5dc03","Type":"ContainerStarted","Data":"08ad56af915f5b8a9ee56add492eab2bf74dd180a52d0072b57294293ef99f06"} May 11 20:50:38.376441 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.376379 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zhs5x" event={"ID":"9d55bb28-de13-44d3-9322-9b22abc5dc03","Type":"ContainerStarted","Data":"b6586df204fb0d9e1e4946664e9d055ada5e9b8139181230d3e0199ffe7cd129"} May 11 20:50:38.377918 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.377897 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p6nhn" event={"ID":"a64fefbb-edf9-4ffa-adf6-0602e2c7e71b","Type":"ContainerStarted","Data":"f12f9921db944dc120122fba39810be284d2155c1c777485dabb9b965a146ad5"} May 11 20:50:38.379253 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.379235 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zvbm7" event={"ID":"037f8bf8-dffb-4ab0-806a-d440b0092789","Type":"ContainerStarted","Data":"882d9731784a5241283f41d4acffc59942512c06e91898da086a6d8a8284c8aa"} May 11 20:50:38.385196 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.385149 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-grk72" podStartSLOduration=3.450277854 podStartE2EDuration="21.385133866s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:20.039968254 +0000 UTC m=+3.338112944" lastFinishedPulling="2026-05-11 20:50:37.974824248 +0000 UTC m=+21.272968956" observedRunningTime="2026-05-11 20:50:38.384306124 +0000 UTC m=+21.682450847" watchObservedRunningTime="2026-05-11 20:50:38.385133866 +0000 UTC m=+21.683278576" May 11 20:50:38.397854 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.397819 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rlkkm" podStartSLOduration=3.46106088 podStartE2EDuration="21.397807395s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:20.03807823 +0000 UTC m=+3.336222929" lastFinishedPulling="2026-05-11 20:50:37.974824743 +0000 UTC m=+21.272969444" observedRunningTime="2026-05-11 20:50:38.39772448 +0000 UTC m=+21.695869201" watchObservedRunningTime="2026-05-11 20:50:38.397807395 +0000 UTC m=+21.695952105" May 11 20:50:38.425023 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.424968 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-p6nhn" podStartSLOduration=11.727552204 podStartE2EDuration="21.424957959s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:20.047231441 +0000 UTC m=+3.345376144" lastFinishedPulling="2026-05-11 20:50:29.744637207 +0000 UTC m=+13.042781899" observedRunningTime="2026-05-11 20:50:38.424714447 +0000 UTC m=+21.722859157" watchObservedRunningTime="2026-05-11 20:50:38.424957959 +0000 UTC m=+21.723102669" May 11 20:50:38.439500 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.439467 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zhs5x" podStartSLOduration=13.439456348 podStartE2EDuration="13.439456348s" podCreationTimestamp="2026-05-11 20:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:50:38.438838997 +0000 UTC m=+21.736983711" watchObservedRunningTime="2026-05-11 20:50:38.439456348 +0000 UTC m=+21.737601057" May 11 20:50:38.457740 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.457702 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zvbm7" podStartSLOduration=3.514193397 podStartE2EDuration="21.457692092s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:20.041027097 +0000 UTC m=+3.339171787" lastFinishedPulling="2026-05-11 20:50:37.984525794 +0000 UTC m=+21.282670482" observedRunningTime="2026-05-11 20:50:38.457256187 +0000 UTC m=+21.755400896" watchObservedRunningTime="2026-05-11 20:50:38.457692092 +0000 UTC m=+21.755836802" May 11 20:50:38.483858 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.483834 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:38.484082 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:38.484064 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered May 11 20:50:38.484160 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:38.484121 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret podName:d85defd8-e86e-4d13-9e13-373afa866baa nodeName:}" failed. No retries permitted until 2026-05-11 20:50:54.484103557 +0000 UTC m=+37.782248249 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret") pod "global-pull-secret-syncer-9qsqx" (UID: "d85defd8-e86e-4d13-9e13-373afa866baa") : object "kube-system"/"original-pull-secret" not registered May 11 20:50:38.888658 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.888433 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-grk72" May 11 20:50:38.889079 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:38.889061 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-grk72" May 11 20:50:39.310175 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.310132 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:39.310377 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:39.310247 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:39.310377 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.310138 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:39.310468 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:39.310390 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:39.382619 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.382588 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4dgqm" event={"ID":"4f4142e8-fca7-4d5f-aecf-381a5629861f","Type":"ContainerStarted","Data":"cfa86a75fb9d17ffe1bb65a38d5318778c6f30a871f946e462be156810c1cd9d"} May 11 20:50:39.385587 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.385560 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 20:50:39.385929 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.385893 2562 generic.go:358] "Generic (PLEG): container finished" podID="9c60e645-398a-4781-9df4-1e5322dfe01e" containerID="a24396d7793d45d9d6ca898078a3239d37a54c7df58e0eaf0eb8f8da714168c7" exitCode=1 May 11 20:50:39.386081 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.386060 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerStarted","Data":"5a719f83344871561a42e26842f1458700868380388b4fcabf9b64a4d22ac418"} May 11 20:50:39.386154 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.386093 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerStarted","Data":"7f04e2dbf433b95e0bfab416345611b7af3ce1d38099a9e244a16e03e692671a"} May 11 20:50:39.386154 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.386107 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerStarted","Data":"86264a8b94cf389fedb0981bf285db88c3e0e2c4c6ede1b1247ca87b5df2912c"} May 11 20:50:39.386154 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.386119 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerStarted","Data":"60a562e7f29584fdbd995d653777dcc776969478ecae781313af6ae8ce7db8dd"} May 11 20:50:39.386154 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.386134 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerDied","Data":"a24396d7793d45d9d6ca898078a3239d37a54c7df58e0eaf0eb8f8da714168c7"} May 11 20:50:39.386154 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.386148 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerStarted","Data":"7c2e0b7ffa462c5e031c3f3edfcaf11d663748c9124a538df4a41e05eb4b1af4"} May 11 20:50:39.386650 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.386621 2562 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-grk72" May 11 20:50:39.387160 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.387136 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-grk72" May 11 20:50:39.395418 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.395385 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4dgqm" podStartSLOduration=4.469520287 podStartE2EDuration="22.395375358s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:20.046873637 +0000 UTC m=+3.345018337" lastFinishedPulling="2026-05-11 20:50:37.972728714 +0000 UTC m=+21.270873408" observedRunningTime="2026-05-11 20:50:39.395220534 +0000 UTC m=+22.693365268" watchObservedRunningTime="2026-05-11 20:50:39.395375358 +0000 UTC m=+22.693520099" May 11 20:50:39.679991 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:39.679791 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" May 11 20:50:40.282666 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:40.282516 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-05-11T20:50:39.679987898Z","UUID":"8cc1c284-29a1-4ea7-9b38-d37a8801d17b","Handler":null,"Name":"","Endpoint":""} May 11 20:50:40.284724 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:40.284696 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 May 11 20:50:40.284834 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:40.284739 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock May 11 20:50:40.309810 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:40.309783 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:40.309941 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:40.309918 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:40.389821 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:40.389783 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" event={"ID":"bd93503a-1025-486b-be75-d191d6bd581d","Type":"ContainerStarted","Data":"b57a5eccfb321e20438199ba5039fabc7d36b3dc431be47411589c3409c4d312"} May 11 20:50:41.309473 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:41.309418 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:41.309473 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:41.309465 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:41.309736 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:41.309583 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:41.309797 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:41.309742 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:41.395142 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:41.395061 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" event={"ID":"bd93503a-1025-486b-be75-d191d6bd581d","Type":"ContainerStarted","Data":"a11236b645062c94087393b00853066c2e64533d132ccb80a7681965234c1531"} May 11 20:50:41.417758 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:41.417708 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4h9" podStartSLOduration=3.772613598 podStartE2EDuration="24.417696356s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:20.045808082 +0000 UTC m=+3.343952771" lastFinishedPulling="2026-05-11 20:50:40.690890837 +0000 UTC m=+23.989035529" observedRunningTime="2026-05-11 20:50:41.417440409 +0000 UTC m=+24.715585119" watchObservedRunningTime="2026-05-11 20:50:41.417696356 +0000 UTC m=+24.715841090" May 11 20:50:42.309949 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:42.309918 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:42.310154 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:42.310064 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:43.309500 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:43.309465 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:43.310329 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:43.309465 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:43.310329 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:43.309598 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:43.310329 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:43.309637 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:43.400131 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:43.400107 2562 generic.go:358] "Generic (PLEG): container finished" podID="96cb7513-d136-4d23-90a5-47ea1604bb7b" containerID="dc66fa6b03a535759b191cb9d1e5b0dbd46fd3a6a3d69f07dd38f1230d6ada77" exitCode=0 May 11 20:50:43.400272 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:43.400179 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" event={"ID":"96cb7513-d136-4d23-90a5-47ea1604bb7b","Type":"ContainerDied","Data":"dc66fa6b03a535759b191cb9d1e5b0dbd46fd3a6a3d69f07dd38f1230d6ada77"} May 11 20:50:43.402766 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:43.402749 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 20:50:43.403129 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:43.403107 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerStarted","Data":"8dcb401c137d356c5df135258545db5fcd0dd5cfe72bc9a5c9ffea5b0ccd401f"} May 11 20:50:44.310099 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:44.310070 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:44.310443 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:44.310161 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:45.309918 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:45.309741 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:45.310165 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:45.309764 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:45.310165 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:45.309982 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:45.310165 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:45.310127 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:45.409143 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:45.409116 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 20:50:45.409552 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:45.409516 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerStarted","Data":"c406aee980e4e870cb6caabf8d22888579c250c7cbf29d28409afdcf742ac450"} May 11 20:50:45.409821 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:45.409785 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:45.409984 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:45.409963 2562 scope.go:117] "RemoveContainer" containerID="a24396d7793d45d9d6ca898078a3239d37a54c7df58e0eaf0eb8f8da714168c7" May 11 20:50:45.411493 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:45.411473 2562 generic.go:358] "Generic (PLEG): container finished" podID="96cb7513-d136-4d23-90a5-47ea1604bb7b" containerID="c23aa7eeb2810c823dfac67ef7f7c5e93803e928b1877ac2821e7507d132e65d" exitCode=0 May 11 20:50:45.411600 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:45.411497 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" event={"ID":"96cb7513-d136-4d23-90a5-47ea1604bb7b","Type":"ContainerDied","Data":"c23aa7eeb2810c823dfac67ef7f7c5e93803e928b1877ac2821e7507d132e65d"} May 11 20:50:45.426937 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:45.426920 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:46.310041 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.309952 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:46.310197 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:46.310068 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:46.418143 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.418125 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 20:50:46.418523 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.418499 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" event={"ID":"9c60e645-398a-4781-9df4-1e5322dfe01e","Type":"ContainerStarted","Data":"61ee8be569af3570d0ae61dce3152bd7a075e426a17abaf5ff2eecdb91404b94"} May 11 20:50:46.418847 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.418823 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:46.418952 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.418857 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:46.424564 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.424540 2562 generic.go:358] "Generic (PLEG): container finished" podID="96cb7513-d136-4d23-90a5-47ea1604bb7b" containerID="8a42bb93a13d2790b8eeeb2989be9ba7ff201a04ecfacb0693c967ccc9026fbb" exitCode=0 May 11 20:50:46.424652 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.424575 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" event={"ID":"96cb7513-d136-4d23-90a5-47ea1604bb7b","Type":"ContainerDied","Data":"8a42bb93a13d2790b8eeeb2989be9ba7ff201a04ecfacb0693c967ccc9026fbb"} May 11 20:50:46.436624 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.436604 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" May 11 20:50:46.446817 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.446765 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn" podStartSLOduration=11.214781071 podStartE2EDuration="29.446750174s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:20.043711909 +0000 UTC m=+3.341856612" lastFinishedPulling="2026-05-11 20:50:38.275681024 +0000 UTC m=+21.573825715" observedRunningTime="2026-05-11 20:50:46.444989614 +0000 UTC m=+29.743134324" watchObservedRunningTime="2026-05-11 20:50:46.446750174 +0000 UTC m=+29.744894887" May 11 20:50:46.600568 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.600497 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9qsqx"] May 11 20:50:46.600687 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.600609 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:46.600726 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:46.600691 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:46.604076 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.603916 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xw5qw"] May 11 20:50:46.604076 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.604066 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:46.604262 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:46.604160 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:46.604638 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.604613 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2ccqq"] May 11 20:50:46.604728 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:46.604715 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:46.604842 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:46.604811 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:48.310522 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:48.310308 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:48.311146 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:48.310307 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:48.311146 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:48.310601 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:48.311146 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:48.310698 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:48.311146 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:48.310311 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:48.311146 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:48.310814 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:50.309534 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:50.309465 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:50.309534 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:50.309510 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:50.310242 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:50.309615 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xw5qw" podUID="bae2e16d-3454-4522-88aa-1afafb2e9cb1" May 11 20:50:50.310242 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:50.309678 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9qsqx" podUID="d85defd8-e86e-4d13-9e13-373afa866baa" May 11 20:50:50.310242 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:50.309731 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:50.310242 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:50.309834 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2ccqq" podUID="3be5f296-2151-4f3e-b028-c72728d855da" May 11 20:50:50.991786 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:50.991758 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-190.ec2.internal" event="NodeReady" May 11 20:50:50.991954 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:50.991908 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready" May 11 20:50:51.026094 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.026062 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj"] May 11 20:50:51.082523 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.082494 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-77758f4558-kfkm4"] May 11 20:50:51.082678 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.082647 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.084790 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.084768 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" May 11 20:50:51.085259 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.085036 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:51.085259 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.085070 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" May 11 20:50:51.085419 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.085253 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:51.085419 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.085305 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:51.085558 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.085489 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:51.085614 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.085571 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs podName:3be5f296-2151-4f3e-b028-c72728d855da nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.085552773 +0000 UTC m=+66.383697466 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs") pod "network-metrics-daemon-2ccqq" (UID: "3be5f296-2151-4f3e-b028-c72728d855da") : object "openshift-multus"/"metrics-daemon-secret" not registered May 11 20:50:51.085677 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.085657 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered May 11 20:50:51.085730 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.085677 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered May 11 20:50:51.085730 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.085690 2562 projected.go:194] Error preparing data for projected volume kube-api-access-6tvs7 for pod openshift-network-diagnostics/network-check-target-xw5qw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:51.085822 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.085744 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7 podName:bae2e16d-3454-4522-88aa-1afafb2e9cb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.085729188 +0000 UTC m=+66.383873892 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-6tvs7" (UniqueName: "kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7") pod "network-check-target-xw5qw" (UID: "bae2e16d-3454-4522-88aa-1afafb2e9cb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] May 11 20:50:51.085822 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.085807 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" May 11 20:50:51.085927 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.085492 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-vdml2\"" May 11 20:50:51.103400 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.103376 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw"] May 11 20:50:51.103509 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.103430 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.105942 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.105918 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:51.106082 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.105948 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" May 11 20:50:51.106082 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.105983 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-56xtm\"" May 11 20:50:51.106203 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.106106 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" May 11 20:50:51.106408 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.106381 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" May 11 20:50:51.116125 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.116108 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj"] May 11 20:50:51.116216 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.116135 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp"] May 11 20:50:51.116295 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.116276 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" May 11 20:50:51.118586 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.118567 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bdn2f\"" May 11 20:50:51.118692 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.118594 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" May 11 20:50:51.118692 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.118646 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" May 11 20:50:51.118819 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.118796 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:51.124875 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.124849 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" May 11 20:50:51.134442 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.134375 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-n8sxs"] May 11 20:50:51.134825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.134806 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp" May 11 20:50:51.137130 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.137110 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:51.137130 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.137121 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" May 11 20:50:51.137311 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.137196 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zzvj7\"" May 11 20:50:51.156702 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.156562 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8bb9d58f7-w68mc"] May 11 20:50:51.156918 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.156900 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.159200 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.159182 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" May 11 20:50:51.159290 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.159256 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wsq67\"" May 11 20:50:51.159290 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.159260 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" May 11 20:50:51.173968 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.173951 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-544c98cc96-j75m5"] May 11 20:50:51.174128 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.174111 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.176607 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.176587 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" May 11 20:50:51.176683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.176588 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" May 11 20:50:51.176683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.176597 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" May 11 20:50:51.176885 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.176873 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6vnrf\"" May 11 20:50:51.181392 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.181374 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" May 11 20:50:51.186472 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.186448 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xgh\" (UniqueName: \"kubernetes.io/projected/d7e41df8-69e8-4481-9aa4-0456bce8d7df-kube-api-access-r8xgh\") pod \"kube-storage-version-migrator-operator-649b864788-qhtlj\" (UID: \"d7e41df8-69e8-4481-9aa4-0456bce8d7df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.186570 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.186534 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e41df8-69e8-4481-9aa4-0456bce8d7df-config\") pod \"kube-storage-version-migrator-operator-649b864788-qhtlj\" (UID: \"d7e41df8-69e8-4481-9aa4-0456bce8d7df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.186618 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.186585 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e41df8-69e8-4481-9aa4-0456bce8d7df-serving-cert\") pod \"kube-storage-version-migrator-operator-649b864788-qhtlj\" (UID: \"d7e41df8-69e8-4481-9aa4-0456bce8d7df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.196866 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.196846 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7589cfd5f4-pdcvt"] May 11 20:50:51.197002 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.196986 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.199575 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.199557 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" May 11 20:50:51.199703 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.199565 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" May 11 20:50:51.199784 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.199687 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-bwhrj\"" May 11 20:50:51.199847 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.199688 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" May 11 20:50:51.199847 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.199692 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" May 11 20:50:51.204578 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.204562 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" May 11 20:50:51.221372 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.221351 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw"] May 11 20:50:51.221485 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.221380 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-77758f4558-kfkm4"] May 11 20:50:51.221485 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.221392 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp"] May 11 20:50:51.221485 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.221404 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n8sxs"] May 11 20:50:51.221485 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.221419 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-697665887d-5q7f6"] May 11 20:50:51.221710 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.221508 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.224516 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.224497 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" May 11 20:50:51.224634 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.224580 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" May 11 20:50:51.224634 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.224605 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" May 11 20:50:51.224854 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.224837 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-974lw\"" May 11 20:50:51.224905 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.224890 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" May 11 20:50:51.224979 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.224843 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" May 11 20:50:51.225181 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.225167 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" May 11 20:50:51.233492 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.233464 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm"] May 11 20:50:51.233641 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.233626 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" May 11 20:50:51.235780 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.235757 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" May 11 20:50:51.235858 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.235759 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-td7cg\"" May 11 20:50:51.235858 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.235759 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" May 11 20:50:51.245740 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.245691 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx"] May 11 20:50:51.245840 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.245825 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm" May 11 20:50:51.248116 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.248096 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" May 11 20:50:51.248116 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.248110 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" May 11 20:50:51.248258 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.248128 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-89m5q\"" May 11 20:50:51.257744 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.257725 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p92nn"] May 11 20:50:51.257875 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.257861 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.260388 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.260305 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" May 11 20:50:51.260388 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.260359 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-82xzn\"" May 11 20:50:51.260541 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.260357 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" May 11 20:50:51.260541 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.260362 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" May 11 20:50:51.260541 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.260494 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" May 11 20:50:51.269726 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.269708 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7"] May 11 20:50:51.269887 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.269869 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p92nn" May 11 20:50:51.272099 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.272078 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" May 11 20:50:51.272203 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.272081 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" May 11 20:50:51.272301 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.272284 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qhtb5\"" May 11 20:50:51.272395 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.272308 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" May 11 20:50:51.282175 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.282059 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-544c98cc96-j75m5"] May 11 20:50:51.282175 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.282084 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8bb9d58f7-w68mc"] May 11 20:50:51.282175 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.282098 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7589cfd5f4-pdcvt"] May 11 20:50:51.282175 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.282112 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx"] May 11 20:50:51.282175 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.282123 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7"] May 11 20:50:51.282175 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.282133 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-697665887d-5q7f6"] May 11 20:50:51.282175 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.282145 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm"] May 11 20:50:51.282557 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.282227 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.282616 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.282564 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p92nn"] May 11 20:50:51.284813 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.284794 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" May 11 20:50:51.284921 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.284819 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" May 11 20:50:51.284921 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.284860 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" May 11 20:50:51.284921 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.284805 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" May 11 20:50:51.285086 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.284819 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9vzv9\"" May 11 20:50:51.286920 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.286901 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.287029 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.286936 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-trusted-ca\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.287029 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.286960 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k668b\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-kube-api-access-k668b\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.287128 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287035 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-config-volume\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.287128 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287066 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-image-registry-private-configuration\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " 
pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.287128 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287117 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e41df8-69e8-4481-9aa4-0456bce8d7df-serving-cert\") pod \"kube-storage-version-migrator-operator-649b864788-qhtlj\" (UID: \"d7e41df8-69e8-4481-9aa4-0456bce8d7df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.287262 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287143 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hjwg\" (UniqueName: \"kubernetes.io/projected/c0e49337-8925-47a4-9b6a-95c7bd4e9887-kube-api-access-4hjwg\") pod \"volume-data-source-validator-6648d555c9-kgwqp\" (UID: \"c0e49337-8925-47a4-9b6a-95c7bd4e9887\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp" May 11 20:50:51.287262 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287170 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-bound-sa-token\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.287262 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287214 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snn94\" (UniqueName: \"kubernetes.io/projected/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-kube-api-access-snn94\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" May 11 20:50:51.287262 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287249 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dzx8\" (UniqueName: \"kubernetes.io/projected/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-kube-api-access-8dzx8\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.287453 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287293 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54fd2ae3-b688-40a5-b542-77c19799ef8a-ca-trust-extracted\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.287453 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287346 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-certificates\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.287453 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287394 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xgh\" (UniqueName: 
\"kubernetes.io/projected/d7e41df8-69e8-4481-9aa4-0456bce8d7df-kube-api-access-r8xgh\") pod \"kube-storage-version-migrator-operator-649b864788-qhtlj\" (UID: \"d7e41df8-69e8-4481-9aa4-0456bce8d7df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.287590 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287469 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffdd990-0f6a-4e43-a62a-94c91746d6fc-serving-cert\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.287590 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287501 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" May 11 20:50:51.287590 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287521 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-installation-pull-secrets\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.287590 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287543 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bffdd990-0f6a-4e43-a62a-94c91746d6fc-trusted-ca\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.287773 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287595 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.287773 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287652 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffdd990-0f6a-4e43-a62a-94c91746d6fc-config\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.287773 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287667 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6z2f\" (UniqueName: \"kubernetes.io/projected/bffdd990-0f6a-4e43-a62a-94c91746d6fc-kube-api-access-p6z2f\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.287773 ip-10-0-135-190 
kubenswrapper[2562]: I0511 20:50:51.287696 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e41df8-69e8-4481-9aa4-0456bce8d7df-config\") pod \"kube-storage-version-migrator-operator-649b864788-qhtlj\" (UID: \"d7e41df8-69e8-4481-9aa4-0456bce8d7df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.287773 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.287719 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-tmp-dir\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.288235 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.288214 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e41df8-69e8-4481-9aa4-0456bce8d7df-config\") pod \"kube-storage-version-migrator-operator-649b864788-qhtlj\" (UID: \"d7e41df8-69e8-4481-9aa4-0456bce8d7df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.291468 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.291447 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7e41df8-69e8-4481-9aa4-0456bce8d7df-serving-cert\") pod \"kube-storage-version-migrator-operator-649b864788-qhtlj\" (UID: \"d7e41df8-69e8-4481-9aa4-0456bce8d7df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.295084 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.294972 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xgh\" (UniqueName: \"kubernetes.io/projected/d7e41df8-69e8-4481-9aa4-0456bce8d7df-kube-api-access-r8xgh\") pod \"kube-storage-version-migrator-operator-649b864788-qhtlj\" (UID: \"d7e41df8-69e8-4481-9aa4-0456bce8d7df\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.388360 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388330 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388367 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388393 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/17f13489-a9eb-4f66-85c8-6967aa3ec01a-nginx-conf\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: 
\"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388421 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0311189e-d497-4d2c-a742-ad52f624750a-snapshots\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.388440 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388471 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-telemetry-config\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388503 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62n4h\" (UniqueName: \"kubernetes.io/projected/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-kube-api-access-62n4h\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.388531 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls podName:7ff2f12c-70ce-4a2c-8828-4562a60dc95d nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.888497079 +0000 UTC m=+35.186641775 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls") pod "dns-default-n8sxs" (UID: "7ff2f12c-70ce-4a2c-8828-4562a60dc95d") : secret "dns-default-metrics-tls" not found May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388602 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311189e-d497-4d2c-a742-ad52f624750a-service-ca-bundle\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388632 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-bound-sa-token\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388657 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64xn7\" (UniqueName: \"kubernetes.io/projected/0311189e-d497-4d2c-a742-ad52f624750a-kube-api-access-64xn7\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388684 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dzx8\" (UniqueName: \"kubernetes.io/projected/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-kube-api-access-8dzx8\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388707 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54fd2ae3-b688-40a5-b542-77c19799ef8a-ca-trust-extracted\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388737 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffdd990-0f6a-4e43-a62a-94c91746d6fc-serving-cert\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388763 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba5697e4-10a6-472f-b7fc-5b4568baf16b-config\") pod \"service-ca-operator-686cb587d-jshx7\" (UID: \"ba5697e4-10a6-472f-b7fc-5b4568baf16b\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.388901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388793 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388816 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m9sf\" (UniqueName: \"kubernetes.io/projected/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-kube-api-access-4m9sf\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388844 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmgc\" (UniqueName: \"kubernetes.io/projected/ba5697e4-10a6-472f-b7fc-5b4568baf16b-kube-api-access-zcmgc\") pod \"service-ca-operator-686cb587d-jshx7\" (UID: \"ba5697e4-10a6-472f-b7fc-5b4568baf16b\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388884 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffdd990-0f6a-4e43-a62a-94c91746d6fc-config\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388913 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6z2f\" (UniqueName: \"kubernetes.io/projected/bffdd990-0f6a-4e43-a62a-94c91746d6fc-kube-api-access-p6z2f\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388936 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-tmp-dir\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.388960 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba5697e4-10a6-472f-b7fc-5b4568baf16b-serving-cert\") pod \"service-ca-operator-686cb587d-jshx7\" (UID: \"ba5697e4-10a6-472f-b7fc-5b4568baf16b\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.388994 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311189e-d497-4d2c-a742-ad52f624750a-trusted-ca-bundle\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 
11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389057 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-stats-auth\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.388972 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bb9d58f7-w68mc: secret "image-registry-tls" not found May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.389143 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls podName:54fd2ae3-b688-40a5-b542-77c19799ef8a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.889122322 +0000 UTC m=+35.187267010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls") pod "image-registry-8bb9d58f7-w68mc" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a") : secret "image-registry-tls" not found May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389196 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54fd2ae3-b688-40a5-b542-77c19799ef8a-ca-trust-extracted\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389250 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g74fl\" (UniqueName: \"kubernetes.io/projected/ea8d418d-c309-4692-8be3-e3a7eeb22225-kube-api-access-g74fl\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389291 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-trusted-ca\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389317 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k668b\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-kube-api-access-k668b\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.389627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389345 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " 
pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389379 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-default-certificate\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389402 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0311189e-d497-4d2c-a742-ad52f624750a-serving-cert\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389406 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-tmp-dir\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389432 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-config-volume\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389460 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-image-registry-private-configuration\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389498 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bffdd990-0f6a-4e43-a62a-94c91746d6fc-trusted-ca\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389551 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hjwg\" (UniqueName: \"kubernetes.io/projected/c0e49337-8925-47a4-9b6a-95c7bd4e9887-kube-api-access-4hjwg\") pod \"volume-data-source-validator-6648d555c9-kgwqp\" (UID: \"c0e49337-8925-47a4-9b6a-95c7bd4e9887\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389577 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0311189e-d497-4d2c-a742-ad52f624750a-tmp\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.390499 
ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389602 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq5js\" (UniqueName: \"kubernetes.io/projected/aa10b731-c86e-4be7-b230-1ef6c613b38f-kube-api-access-kq5js\") pod \"network-check-source-6859b67c86-hzqqm\" (UID: \"aa10b731-c86e-4be7-b230-1ef6c613b38f\") " pod="openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389635 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snn94\" (UniqueName: \"kubernetes.io/projected/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-kube-api-access-snn94\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389664 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-certificates\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389701 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389729 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-installation-pull-secrets\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389763 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389794 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.390499 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389818 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bffdd990-0f6a-4e43-a62a-94c91746d6fc-config\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 
20:50:51.391170 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389842 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.391170 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.389899 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-config-volume\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.391170 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.389985 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found May 11 20:50:51.391170 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.390060 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls podName:98bf1c69-fee9-4051-8fbc-c2fffe394f8b nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.890045102 +0000 UTC m=+35.188189792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-tghxw" (UID: "98bf1c69-fee9-4051-8fbc-c2fffe394f8b") : secret "samples-operator-tls" not found May 11 20:50:51.391170 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.390278 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-trusted-ca\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.391170 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.390356 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-certificates\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.391170 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.390717 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bffdd990-0f6a-4e43-a62a-94c91746d6fc-trusted-ca\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.392061 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.391837 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bffdd990-0f6a-4e43-a62a-94c91746d6fc-serving-cert\") pod \"console-operator-77758f4558-kfkm4\" (UID: \"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.392385 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.392369 2562 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-image-registry-private-configuration\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.392513 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.392491 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-installation-pull-secrets\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.393296 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.393271 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" May 11 20:50:51.400751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.400728 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hjwg\" (UniqueName: \"kubernetes.io/projected/c0e49337-8925-47a4-9b6a-95c7bd4e9887-kube-api-access-4hjwg\") pod \"volume-data-source-validator-6648d555c9-kgwqp\" (UID: \"c0e49337-8925-47a4-9b6a-95c7bd4e9887\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp" May 11 20:50:51.401435 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.401126 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-bound-sa-token\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.401630 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.401610 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snn94\" (UniqueName: \"kubernetes.io/projected/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-kube-api-access-snn94\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" May 11 20:50:51.401754 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.401730 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k668b\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-kube-api-access-k668b\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.402362 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.402330 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dzx8\" (UniqueName: \"kubernetes.io/projected/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-kube-api-access-8dzx8\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.402899 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.402877 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6z2f\" (UniqueName: \"kubernetes.io/projected/bffdd990-0f6a-4e43-a62a-94c91746d6fc-kube-api-access-p6z2f\") pod \"console-operator-77758f4558-kfkm4\" (UID: 
\"bffdd990-0f6a-4e43-a62a-94c91746d6fc\") " pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.413229 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.413183 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:50:51.445236 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.445216 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp" May 11 20:50:51.490293 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490271 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-default-certificate\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.490417 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490303 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0311189e-d497-4d2c-a742-ad52f624750a-serving-cert\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.490417 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490325 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0311189e-d497-4d2c-a742-ad52f624750a-tmp\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.490417 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490343 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq5js\" (UniqueName: \"kubernetes.io/projected/aa10b731-c86e-4be7-b230-1ef6c613b38f-kube-api-access-kq5js\") pod \"network-check-source-6859b67c86-hzqqm\" (UID: \"aa10b731-c86e-4be7-b230-1ef6c613b38f\") " pod="openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm" May 11 20:50:51.490417 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490380 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn" May 11 20:50:51.490417 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490408 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.490656 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.490476 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 11 20:50:51.490656 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.490490 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found May 11 20:50:51.490656 
ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490487 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.490656 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.490535 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert podName:9d81ee0b-b7dc-45a9-bc60-e7389a88feb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.990515473 +0000 UTC m=+35.288660164 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert") pod "ingress-canary-p92nn" (UID: "9d81ee0b-b7dc-45a9-bc60-e7389a88feb1") : secret "canary-serving-cert" not found May 11 20:50:51.490656 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.490577 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.990560807 +0000 UTC m=+35.288705501 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : secret "router-metrics-certs-default" not found May 11 20:50:51.490656 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.490600 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.990590971 +0000 UTC m=+35.288735691 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : configmap references non-existent config key: service-ca.crt May 11 20:50:51.490656 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490641 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490670 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/17f13489-a9eb-4f66-85c8-6967aa3ec01a-nginx-conf\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490700 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0311189e-d497-4d2c-a742-ad52f624750a-snapshots\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.490744 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490755 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-telemetry-config\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.490792 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls podName:a8dd435c-a454-4ae4-935a-67c1f9c9ec81 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.990777957 +0000 UTC m=+35.288922660 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-gldtx" (UID: "a8dd435c-a454-4ae4-935a-67c1f9c9ec81") : secret "cluster-monitoring-operator-tls" not found May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490815 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62n4h\" (UniqueName: \"kubernetes.io/projected/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-kube-api-access-62n4h\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490838 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0311189e-d497-4d2c-a742-ad52f624750a-tmp\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490856 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311189e-d497-4d2c-a742-ad52f624750a-service-ca-bundle\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490890 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-64xn7\" (UniqueName: \"kubernetes.io/projected/0311189e-d497-4d2c-a742-ad52f624750a-kube-api-access-64xn7\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490927 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba5697e4-10a6-472f-b7fc-5b4568baf16b-config\") pod \"service-ca-operator-686cb587d-jshx7\" (UID: \"ba5697e4-10a6-472f-b7fc-5b4568baf16b\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490967 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m9sf\" (UniqueName: \"kubernetes.io/projected/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-kube-api-access-4m9sf\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.490992 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmgc\" (UniqueName: \"kubernetes.io/projected/ba5697e4-10a6-472f-b7fc-5b4568baf16b-kube-api-access-zcmgc\") pod \"service-ca-operator-686cb587d-jshx7\" (UID: \"ba5697e4-10a6-472f-b7fc-5b4568baf16b\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491072 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba5697e4-10a6-472f-b7fc-5b4568baf16b-serving-cert\") pod \"service-ca-operator-686cb587d-jshx7\" (UID: \"ba5697e4-10a6-472f-b7fc-5b4568baf16b\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491100 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311189e-d497-4d2c-a742-ad52f624750a-trusted-ca-bundle\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491136 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-stats-auth\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.491156 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491163 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g74fl\" (UniqueName: \"kubernetes.io/projected/ea8d418d-c309-4692-8be3-e3a7eeb22225-kube-api-access-g74fl\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.491985 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491190 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" May 11 20:50:51.491985 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.491291 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found May 11 20:50:51.491985 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.491326 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert podName:17f13489-a9eb-4f66-85c8-6967aa3ec01a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:51.991315599 +0000 UTC m=+35.289460288 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-5q7f6" (UID: "17f13489-a9eb-4f66-85c8-6967aa3ec01a") : secret "networking-console-plugin-cert" not found May 11 20:50:51.491985 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491324 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0311189e-d497-4d2c-a742-ad52f624750a-snapshots\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.491985 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491493 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311189e-d497-4d2c-a742-ad52f624750a-service-ca-bundle\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.491985 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491517 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/17f13489-a9eb-4f66-85c8-6967aa3ec01a-nginx-conf\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" May 11 20:50:51.491985 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491597 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba5697e4-10a6-472f-b7fc-5b4568baf16b-config\") pod \"service-ca-operator-686cb587d-jshx7\" (UID: \"ba5697e4-10a6-472f-b7fc-5b4568baf16b\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.491985 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.491603 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-telemetry-config\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.492398 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.492170 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0311189e-d497-4d2c-a742-ad52f624750a-trusted-ca-bundle\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.493282 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.493259 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0311189e-d497-4d2c-a742-ad52f624750a-serving-cert\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.494111 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.494091 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-default-certificate\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.494210 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.494170 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-stats-auth\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.494265 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.494204 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba5697e4-10a6-472f-b7fc-5b4568baf16b-serving-cert\") pod \"service-ca-operator-686cb587d-jshx7\" (UID: \"ba5697e4-10a6-472f-b7fc-5b4568baf16b\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.498979 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.498925 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62n4h\" (UniqueName: \"kubernetes.io/projected/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-kube-api-access-62n4h\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn" May 11 20:50:51.499283 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.499241 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq5js\" (UniqueName: \"kubernetes.io/projected/aa10b731-c86e-4be7-b230-1ef6c613b38f-kube-api-access-kq5js\") pod \"network-check-source-6859b67c86-hzqqm\" (UID: \"aa10b731-c86e-4be7-b230-1ef6c613b38f\") " pod="openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm" May 11 20:50:51.500595 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.500556 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-64xn7\" (UniqueName: \"kubernetes.io/projected/0311189e-d497-4d2c-a742-ad52f624750a-kube-api-access-64xn7\") pod \"insights-operator-544c98cc96-j75m5\" (UID: \"0311189e-d497-4d2c-a742-ad52f624750a\") " pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.500733 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.500712 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g74fl\" (UniqueName: \"kubernetes.io/projected/ea8d418d-c309-4692-8be3-e3a7eeb22225-kube-api-access-g74fl\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.500936 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.500917 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmgc\" (UniqueName: \"kubernetes.io/projected/ba5697e4-10a6-472f-b7fc-5b4568baf16b-kube-api-access-zcmgc\") pod \"service-ca-operator-686cb587d-jshx7\" (UID: \"ba5697e4-10a6-472f-b7fc-5b4568baf16b\") " pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.500936 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.500930 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m9sf\" (UniqueName: \"kubernetes.io/projected/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-kube-api-access-4m9sf\") pod 
\"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.507197 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.507176 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-544c98cc96-j75m5" May 11 20:50:51.558085 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.558059 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm" May 11 20:50:51.591846 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.591825 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" May 11 20:50:51.894535 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.894459 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" May 11 20:50:51.894752 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.894553 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:51.894752 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.894600 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found May 11 20:50:51.894752 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.894622 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:51.894752 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.894658 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls podName:98bf1c69-fee9-4051-8fbc-c2fffe394f8b nodeName:}" failed. No retries permitted until 2026-05-11 20:50:52.894643134 +0000 UTC m=+36.192787822 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-tghxw" (UID: "98bf1c69-fee9-4051-8fbc-c2fffe394f8b") : secret "samples-operator-tls" not found May 11 20:50:51.894752 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.894687 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 11 20:50:51.894752 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.894711 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 11 20:50:51.894752 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.894735 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bb9d58f7-w68mc: secret "image-registry-tls" not found May 11 20:50:51.894752 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.894752 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls podName:7ff2f12c-70ce-4a2c-8828-4562a60dc95d nodeName:}" failed. No retries permitted until 2026-05-11 20:50:52.89473784 +0000 UTC m=+36.192882529 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls") pod "dns-default-n8sxs" (UID: "7ff2f12c-70ce-4a2c-8828-4562a60dc95d") : secret "dns-default-metrics-tls" not found May 11 20:50:51.895042 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.894780 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls podName:54fd2ae3-b688-40a5-b542-77c19799ef8a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:52.894768097 +0000 UTC m=+36.192912793 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls") pod "image-registry-8bb9d58f7-w68mc" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a") : secret "image-registry-tls" not found May 11 20:50:51.995565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.995327 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" May 11 20:50:51.995896 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.995631 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn" May 11 20:50:51.995896 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.995662 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.995896 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.995703 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:51.995896 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:51.995742 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:51.996174 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.995481 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found May 11 20:50:51.996174 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.995968 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert podName:17f13489-a9eb-4f66-85c8-6967aa3ec01a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:52.995946784 +0000 UTC m=+36.294091492 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-5q7f6" (UID: "17f13489-a9eb-4f66-85c8-6967aa3ec01a") : secret "networking-console-plugin-cert" not found May 11 20:50:51.996174 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.996052 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 11 20:50:51.996174 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.996090 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert podName:9d81ee0b-b7dc-45a9-bc60-e7389a88feb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:52.996077014 +0000 UTC m=+36.294221710 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert") pod "ingress-canary-p92nn" (UID: "9d81ee0b-b7dc-45a9-bc60-e7389a88feb1") : secret "canary-serving-cert" not found May 11 20:50:51.996174 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.996161 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found May 11 20:50:51.996416 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.996193 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:52.996182866 +0000 UTC m=+36.294327554 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : secret "router-metrics-certs-default" not found May 11 20:50:51.996416 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.996262 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:52.996252667 +0000 UTC m=+36.294397357 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : configmap references non-existent config key: service-ca.crt May 11 20:50:51.996416 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.995899 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found May 11 20:50:51.996416 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:51.996304 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls podName:a8dd435c-a454-4ae4-935a-67c1f9c9ec81 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:52.996292872 +0000 UTC m=+36.294437564 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-gldtx" (UID: "a8dd435c-a454-4ae4-935a-67c1f9c9ec81") : secret "cluster-monitoring-operator-tls" not found May 11 20:50:52.168183 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.168069 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7"] May 11 20:50:52.171312 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.171277 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-544c98cc96-j75m5"] May 11 20:50:52.180192 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.180152 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-77758f4558-kfkm4"] May 11 20:50:52.183449 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.183428 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp"] May 11 20:50:52.186983 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.186032 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj"] May 11 20:50:52.198619 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.198595 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm"] May 11 20:50:52.226137 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:52.226109 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba5697e4_10a6_472f_b7fc_5b4568baf16b.slice/crio-514cbca5d08f154b08307c2e8b3f829b47a0ace932f7899193a0e40919a751f0 WatchSource:0}: Error finding container 514cbca5d08f154b08307c2e8b3f829b47a0ace932f7899193a0e40919a751f0: Status 404 returned error can't find the container with id 514cbca5d08f154b08307c2e8b3f829b47a0ace932f7899193a0e40919a751f0 May 11 20:50:52.226534 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:52.226507 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0311189e_d497_4d2c_a742_ad52f624750a.slice/crio-6c1d18693af72632220a0fb8acfb367ab0112203efd7dfb841148ffa43fe8b74 WatchSource:0}: Error finding container 6c1d18693af72632220a0fb8acfb367ab0112203efd7dfb841148ffa43fe8b74: Status 404 returned error can't find the container with id 6c1d18693af72632220a0fb8acfb367ab0112203efd7dfb841148ffa43fe8b74 May 11 20:50:52.227340 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:52.227317 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbffdd990_0f6a_4e43_a62a_94c91746d6fc.slice/crio-37bbd65e577b6dd2abbcfd6af8e57770eee3bb9909a43402982f0634c098764d WatchSource:0}: Error finding container 37bbd65e577b6dd2abbcfd6af8e57770eee3bb9909a43402982f0634c098764d: Status 404 returned error can't find the container with id 37bbd65e577b6dd2abbcfd6af8e57770eee3bb9909a43402982f0634c098764d May 11 20:50:52.228705 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:52.228540 2562 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e41df8_69e8_4481_9aa4_0456bce8d7df.slice/crio-2cd6e2a120dd1f14ed6be17bbbdaca92675039acab30fd59fa28ea2ec4519421 WatchSource:0}: Error finding container 2cd6e2a120dd1f14ed6be17bbbdaca92675039acab30fd59fa28ea2ec4519421: Status 404 returned error can't find the container with id 2cd6e2a120dd1f14ed6be17bbbdaca92675039acab30fd59fa28ea2ec4519421 May 11 20:50:52.229809 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:52.229788 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0e49337_8925_47a4_9b6a_95c7bd4e9887.slice/crio-937a0e253a67afb875a8c803d4c6e9c669e4d48183da18615f75b7d254a581e4 WatchSource:0}: Error finding container 937a0e253a67afb875a8c803d4c6e9c669e4d48183da18615f75b7d254a581e4: Status 404 returned error can't find the container with id 937a0e253a67afb875a8c803d4c6e9c669e4d48183da18615f75b7d254a581e4 May 11 20:50:52.230023 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:52.229987 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa10b731_c86e_4be7_b230_1ef6c613b38f.slice/crio-e0b31835e12cdf5b24380d3e895b8455fd200f9f2bf6e4001346dd5f68bce2e8 WatchSource:0}: Error finding container e0b31835e12cdf5b24380d3e895b8455fd200f9f2bf6e4001346dd5f68bce2e8: Status 404 returned error can't find the container with id e0b31835e12cdf5b24380d3e895b8455fd200f9f2bf6e4001346dd5f68bce2e8 May 11 20:50:52.309847 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.309704 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq" May 11 20:50:52.309994 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.309966 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:50:52.310070 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.310023 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx" May 11 20:50:52.312134 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.312112 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" May 11 20:50:52.312406 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.312385 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mdvls\"" May 11 20:50:52.312406 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.312407 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kqkzb\"" May 11 20:50:52.312523 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.312395 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" May 11 20:50:52.437842 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.437811 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp" event={"ID":"c0e49337-8925-47a4-9b6a-95c7bd4e9887","Type":"ContainerStarted","Data":"937a0e253a67afb875a8c803d4c6e9c669e4d48183da18615f75b7d254a581e4"} May 11 20:50:52.438769 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.438745 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" event={"ID":"ba5697e4-10a6-472f-b7fc-5b4568baf16b","Type":"ContainerStarted","Data":"514cbca5d08f154b08307c2e8b3f829b47a0ace932f7899193a0e40919a751f0"} May 11 20:50:52.439664 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.439631 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-544c98cc96-j75m5" event={"ID":"0311189e-d497-4d2c-a742-ad52f624750a","Type":"ContainerStarted","Data":"6c1d18693af72632220a0fb8acfb367ab0112203efd7dfb841148ffa43fe8b74"} May 11 20:50:52.440548 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.440523 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" event={"ID":"bffdd990-0f6a-4e43-a62a-94c91746d6fc","Type":"ContainerStarted","Data":"37bbd65e577b6dd2abbcfd6af8e57770eee3bb9909a43402982f0634c098764d"} May 11 20:50:52.441495 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.441466 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm" event={"ID":"aa10b731-c86e-4be7-b230-1ef6c613b38f","Type":"ContainerStarted","Data":"e0b31835e12cdf5b24380d3e895b8455fd200f9f2bf6e4001346dd5f68bce2e8"} May 11 20:50:52.442362 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.442343 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" event={"ID":"d7e41df8-69e8-4481-9aa4-0456bce8d7df","Type":"ContainerStarted","Data":"2cd6e2a120dd1f14ed6be17bbbdaca92675039acab30fd59fa28ea2ec4519421"} May 11 20:50:52.444433 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.444413 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" event={"ID":"96cb7513-d136-4d23-90a5-47ea1604bb7b","Type":"ContainerStarted","Data":"42c37ad432ac0fc33e9bccdb053141b62d4a5f2d9e70374ad7fabcb26845c15d"} May 11 20:50:52.904574 ip-10-0-135-190 kubenswrapper[2562]: I0511 
20:50:52.904534 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" May 11 20:50:52.904829 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.904637 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs" May 11 20:50:52.904829 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:52.904700 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:50:52.904829 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:52.904810 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found May 11 20:50:52.905079 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:52.904821 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found May 11 20:50:52.905079 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:52.904813 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found May 11 20:50:52.905079 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:52.904881 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bb9d58f7-w68mc: secret "image-registry-tls" not found May 11 20:50:52.905079 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:52.904882 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls podName:98bf1c69-fee9-4051-8fbc-c2fffe394f8b nodeName:}" failed. No retries permitted until 2026-05-11 20:50:54.90486381 +0000 UTC m=+38.203008511 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-tghxw" (UID: "98bf1c69-fee9-4051-8fbc-c2fffe394f8b") : secret "samples-operator-tls" not found May 11 20:50:52.905079 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:52.904925 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls podName:7ff2f12c-70ce-4a2c-8828-4562a60dc95d nodeName:}" failed. No retries permitted until 2026-05-11 20:50:54.904913073 +0000 UTC m=+38.203057762 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls") pod "dns-default-n8sxs" (UID: "7ff2f12c-70ce-4a2c-8828-4562a60dc95d") : secret "dns-default-metrics-tls" not found May 11 20:50:52.905079 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:52.904939 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls podName:54fd2ae3-b688-40a5-b542-77c19799ef8a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:54.904930951 +0000 UTC m=+38.203075643 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls") pod "image-registry-8bb9d58f7-w68mc" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a") : secret "image-registry-tls" not found May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:53.005772 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:53.005834 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:53.005947 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:53.006020 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn" May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:53.006053 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:53.006175 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:53.006233 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs 
podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:55.006215138 +0000 UTC m=+38.304359841 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : secret "router-metrics-certs-default" not found May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:53.006507 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:53.006562 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert podName:17f13489-a9eb-4f66-85c8-6967aa3ec01a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:55.006545384 +0000 UTC m=+38.304690089 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-5q7f6" (UID: "17f13489-a9eb-4f66-85c8-6967aa3ec01a") : secret "networking-console-plugin-cert" not found May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:53.006598 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:53.006628 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:55.006618788 +0000 UTC m=+38.304763480 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : configmap references non-existent config key: service-ca.crt May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:53.006645 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert podName:9d81ee0b-b7dc-45a9-bc60-e7389a88feb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:55.006636464 +0000 UTC m=+38.304781160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert") pod "ingress-canary-p92nn" (UID: "9d81ee0b-b7dc-45a9-bc60-e7389a88feb1") : secret "canary-serving-cert" not found May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:53.006674 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found May 11 20:50:53.006813 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:53.006713 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls podName:a8dd435c-a454-4ae4-935a-67c1f9c9ec81 nodeName:}" failed. 
No retries permitted until 2026-05-11 20:50:55.006701984 +0000 UTC m=+38.304846681 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-gldtx" (UID: "a8dd435c-a454-4ae4-935a-67c1f9c9ec81") : secret "cluster-monitoring-operator-tls" not found
May 11 20:50:53.456058 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:53.455246 2562 generic.go:358] "Generic (PLEG): container finished" podID="96cb7513-d136-4d23-90a5-47ea1604bb7b" containerID="42c37ad432ac0fc33e9bccdb053141b62d4a5f2d9e70374ad7fabcb26845c15d" exitCode=0
May 11 20:50:53.456058 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:53.455310 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" event={"ID":"96cb7513-d136-4d23-90a5-47ea1604bb7b","Type":"ContainerDied","Data":"42c37ad432ac0fc33e9bccdb053141b62d4a5f2d9e70374ad7fabcb26845c15d"}
May 11 20:50:54.521407 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:54.521354 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx"
May 11 20:50:54.525959 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:54.525906 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d85defd8-e86e-4d13-9e13-373afa866baa-original-pull-secret\") pod \"global-pull-secret-syncer-9qsqx\" (UID: \"d85defd8-e86e-4d13-9e13-373afa866baa\") " pod="kube-system/global-pull-secret-syncer-9qsqx"
May 11 20:50:54.766616 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:54.766584 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9qsqx"
May 11 20:50:54.925160 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:54.925079 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc"
May 11 20:50:54.925323 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:54.925202 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw"
May 11 20:50:54.925323 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:54.925238 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
May 11 20:50:54.925323 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:54.925257 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bb9d58f7-w68mc: secret "image-registry-tls" not found
May 11 20:50:54.925323 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:54.925305 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs"
May 11 20:50:54.925323 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:54.925316 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls podName:54fd2ae3-b688-40a5-b542-77c19799ef8a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:58.925292941 +0000 UTC m=+42.223437644 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls") pod "image-registry-8bb9d58f7-w68mc" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a") : secret "image-registry-tls" not found
May 11 20:50:54.925604 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:54.925419 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
May 11 20:50:54.925604 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:54.925455 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
May 11 20:50:54.925604 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:54.925489 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls podName:98bf1c69-fee9-4051-8fbc-c2fffe394f8b nodeName:}" failed. No retries permitted until 2026-05-11 20:50:58.925465893 +0000 UTC m=+42.223610599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-tghxw" (UID: "98bf1c69-fee9-4051-8fbc-c2fffe394f8b") : secret "samples-operator-tls" not found
May 11 20:50:54.925604 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:54.925541 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls podName:7ff2f12c-70ce-4a2c-8828-4562a60dc95d nodeName:}" failed. No retries permitted until 2026-05-11 20:50:58.925496704 +0000 UTC m=+42.223641433 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls") pod "dns-default-n8sxs" (UID: "7ff2f12c-70ce-4a2c-8828-4562a60dc95d") : secret "dns-default-metrics-tls" not found
May 11 20:50:55.026474 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:55.026430 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6"
May 11 20:50:55.026734 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:55.026508 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn"
May 11 20:50:55.026734 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:55.026543 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:50:55.026734 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:55.026584 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
May 11 20:50:55.026734 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:55.026659 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert podName:17f13489-a9eb-4f66-85c8-6967aa3ec01a nodeName:}" failed. No retries permitted until 2026-05-11 20:50:59.026635439 +0000 UTC m=+42.324780144 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-5q7f6" (UID: "17f13489-a9eb-4f66-85c8-6967aa3ec01a") : secret "networking-console-plugin-cert" not found
May 11 20:50:55.026734 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:55.026710 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:59.026680915 +0000 UTC m=+42.324825618 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : configmap references non-existent config key: service-ca.crt
May 11 20:50:55.027023 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:55.026802 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
May 11 20:50:55.027023 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:55.026587 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:50:55.027023 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:55.026860 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert podName:9d81ee0b-b7dc-45a9-bc60-e7389a88feb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:59.026826607 +0000 UTC m=+42.324971299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert") pod "ingress-canary-p92nn" (UID: "9d81ee0b-b7dc-45a9-bc60-e7389a88feb1") : secret "canary-serving-cert" not found
May 11 20:50:55.027023 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:55.026871 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
May 11 20:50:55.027023 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:55.026920 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:59.026906653 +0000 UTC m=+42.325051357 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : secret "router-metrics-certs-default" not found
May 11 20:50:55.027023 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:55.026956 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx"
May 11 20:50:55.027282 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:55.027052 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
May 11 20:50:55.027282 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:55.027095 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls podName:a8dd435c-a454-4ae4-935a-67c1f9c9ec81 nodeName:}" failed. No retries permitted until 2026-05-11 20:50:59.027083969 +0000 UTC m=+42.325228665 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-gldtx" (UID: "a8dd435c-a454-4ae4-935a-67c1f9c9ec81") : secret "cluster-monitoring-operator-tls" not found
May 11 20:50:58.663883 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:58.663859 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9qsqx"]
May 11 20:50:58.666720 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:50:58.666663 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd85defd8_e86e_4d13_9e13_373afa866baa.slice/crio-88f3dd877e2d819f1bd0f576189b6308b1009ee96ec96f5ffacbff37d7fc9fc4 WatchSource:0}: Error finding container 88f3dd877e2d819f1bd0f576189b6308b1009ee96ec96f5ffacbff37d7fc9fc4: Status 404 returned error can't find the container with id 88f3dd877e2d819f1bd0f576189b6308b1009ee96ec96f5ffacbff37d7fc9fc4
May 11 20:50:58.960661 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:58.960629 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw"
May 11 20:50:58.960829 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:58.960707 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs"
May 11 20:50:58.960829 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:58.960762 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc"
May 11 20:50:58.960829 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:58.960785 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
May 11 20:50:58.960981 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:58.960843 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls podName:98bf1c69-fee9-4051-8fbc-c2fffe394f8b nodeName:}" failed. No retries permitted until 2026-05-11 20:51:06.960826211 +0000 UTC m=+50.258970920 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-tghxw" (UID: "98bf1c69-fee9-4051-8fbc-c2fffe394f8b") : secret "samples-operator-tls" not found
May 11 20:50:58.960981 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:58.960847 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
May 11 20:50:58.960981 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:58.960859 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bb9d58f7-w68mc: secret "image-registry-tls" not found
May 11 20:50:58.960981 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:58.960866 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
May 11 20:50:58.960981 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:58.960892 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls podName:54fd2ae3-b688-40a5-b542-77c19799ef8a nodeName:}" failed. No retries permitted until 2026-05-11 20:51:06.960881427 +0000 UTC m=+50.259026130 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls") pod "image-registry-8bb9d58f7-w68mc" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a") : secret "image-registry-tls" not found
May 11 20:50:58.960981 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:58.960925 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls podName:7ff2f12c-70ce-4a2c-8828-4562a60dc95d nodeName:}" failed. No retries permitted until 2026-05-11 20:51:06.960908206 +0000 UTC m=+50.259052914 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls") pod "dns-default-n8sxs" (UID: "7ff2f12c-70ce-4a2c-8828-4562a60dc95d") : secret "dns-default-metrics-tls" not found
May 11 20:50:59.065074 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.064984 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn"
May 11 20:50:59.065074 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.065052 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:50:59.065265 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.065095 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:50:59.065265 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.065134 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx"
May 11 20:50:59.065265 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.065240 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6"
May 11 20:50:59.065408 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:59.065370 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
May 11 20:50:59.065457 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:59.065425 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert podName:17f13489-a9eb-4f66-85c8-6967aa3ec01a nodeName:}" failed. No retries permitted until 2026-05-11 20:51:07.065407032 +0000 UTC m=+50.363551726 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-5q7f6" (UID: "17f13489-a9eb-4f66-85c8-6967aa3ec01a") : secret "networking-console-plugin-cert" not found
May 11 20:50:59.065987 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:59.065690 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
May 11 20:50:59.065987 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:59.065730 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert podName:9d81ee0b-b7dc-45a9-bc60-e7389a88feb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:07.065718218 +0000 UTC m=+50.363862917 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert") pod "ingress-canary-p92nn" (UID: "9d81ee0b-b7dc-45a9-bc60-e7389a88feb1") : secret "canary-serving-cert" not found
May 11 20:50:59.065987 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:59.065779 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
May 11 20:50:59.065987 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:59.065808 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:07.065799244 +0000 UTC m=+50.363943940 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : secret "router-metrics-certs-default" not found
May 11 20:50:59.065987 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:59.065860 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
May 11 20:50:59.065987 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:59.065869 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:07.065859665 +0000 UTC m=+50.364004361 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : configmap references non-existent config key: service-ca.crt
May 11 20:50:59.065987 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:50:59.065950 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls podName:a8dd435c-a454-4ae4-935a-67c1f9c9ec81 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:07.065931425 +0000 UTC m=+50.364076127 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-gldtx" (UID: "a8dd435c-a454-4ae4-935a-67c1f9c9ec81") : secret "cluster-monitoring-operator-tls" not found
May 11 20:50:59.471076 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.470999 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" event={"ID":"d7e41df8-69e8-4481-9aa4-0456bce8d7df","Type":"ContainerStarted","Data":"bcd15c89f6873c3d0d683c6322ad9238722065d7e6670e6ad9a4eba65279e0d5"}
May 11 20:50:59.472832 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.472804 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9qsqx" event={"ID":"d85defd8-e86e-4d13-9e13-373afa866baa","Type":"ContainerStarted","Data":"88f3dd877e2d819f1bd0f576189b6308b1009ee96ec96f5ffacbff37d7fc9fc4"}
May 11 20:50:59.476702 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.476676 2562 generic.go:358] "Generic (PLEG): container finished" podID="96cb7513-d136-4d23-90a5-47ea1604bb7b" containerID="f181c26ee0c7b1e0f276d9094bbd1ebf3c0d39c5e7b633005203a49fb60c2e4e" exitCode=0
May 11 20:50:59.476800 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.476755 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" event={"ID":"96cb7513-d136-4d23-90a5-47ea1604bb7b","Type":"ContainerDied","Data":"f181c26ee0c7b1e0f276d9094bbd1ebf3c0d39c5e7b633005203a49fb60c2e4e"}
May 11 20:50:59.481325 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.481271 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp" event={"ID":"c0e49337-8925-47a4-9b6a-95c7bd4e9887","Type":"ContainerStarted","Data":"bfc96a2b4dd766ff17308d8dd027492bf1083bae325b375ee86772fe3fb38402"}
May 11 20:50:59.483846 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.483823 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" event={"ID":"ba5697e4-10a6-472f-b7fc-5b4568baf16b","Type":"ContainerStarted","Data":"f7e6e09fcaca547b8dd31e161289ac1331b0599446a145d605783d9cfdad0933"}
May 11 20:50:59.485374 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.485345 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-544c98cc96-j75m5" event={"ID":"0311189e-d497-4d2c-a742-ad52f624750a","Type":"ContainerStarted","Data":"f8887b3666fe8266de153830d96f8133100cdbd8a5355dc938e53a65cf4a0f1b"}
May 11 20:50:59.487099 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.487079 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/0.log"
May 11 20:50:59.487191 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.487114 2562 generic.go:358] "Generic (PLEG): container finished" podID="bffdd990-0f6a-4e43-a62a-94c91746d6fc" containerID="4ac6628412dbb7b80a9769e72f31cc38b73edabf6065639c97cea9183102a6e3" exitCode=255
May 11 20:50:59.487191 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.487167 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" event={"ID":"bffdd990-0f6a-4e43-a62a-94c91746d6fc","Type":"ContainerDied","Data":"4ac6628412dbb7b80a9769e72f31cc38b73edabf6065639c97cea9183102a6e3"}
May 11 20:50:59.487471 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.487415 2562 scope.go:117] "RemoveContainer" containerID="4ac6628412dbb7b80a9769e72f31cc38b73edabf6065639c97cea9183102a6e3"
May 11 20:50:59.488660 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.488622 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm" event={"ID":"aa10b731-c86e-4be7-b230-1ef6c613b38f","Type":"ContainerStarted","Data":"0bd5da78be33a0dbb7fa19550583eebdcbe39f87e948bfde1d94c4f5726c84d6"}
May 11 20:50:59.490936 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.490615 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" podStartSLOduration=36.207570618 podStartE2EDuration="42.490600397s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:52.251129139 +0000 UTC m=+35.549273828" lastFinishedPulling="2026-05-11 20:50:58.534158912 +0000 UTC m=+41.832303607" observedRunningTime="2026-05-11 20:50:59.489588821 +0000 UTC m=+42.787733533" watchObservedRunningTime="2026-05-11 20:50:59.490600397 +0000 UTC m=+42.788745107"
May 11 20:50:59.574456 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.572728 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" podStartSLOduration=36.272343434 podStartE2EDuration="42.572708656s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:52.228194575 +0000 UTC m=+35.526339278" lastFinishedPulling="2026-05-11 20:50:58.528559799 +0000 UTC m=+41.826704500" observedRunningTime="2026-05-11 20:50:59.571169383 +0000 UTC m=+42.869314092" watchObservedRunningTime="2026-05-11 20:50:59.572708656 +0000 UTC m=+42.870853367"
May 11 20:50:59.592045 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.589196 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-6859b67c86-hzqqm" podStartSLOduration=36.298355216 podStartE2EDuration="42.589178933s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:52.250672176 +0000 UTC m=+35.548816866" lastFinishedPulling="2026-05-11 20:50:58.541495876 +0000 UTC m=+41.839640583" observedRunningTime="2026-05-11 20:50:59.588316851 +0000 UTC m=+42.886461563" watchObservedRunningTime="2026-05-11 20:50:59.589178933 +0000 UTC m=+42.887323643"
May 11 20:50:59.616610 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.616457 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-6648d555c9-kgwqp" podStartSLOduration=36.338627379 podStartE2EDuration="42.616440111s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:52.250756124 +0000 UTC m=+35.548900825" lastFinishedPulling="2026-05-11 20:50:58.528568856 +0000 UTC m=+41.826713557" observedRunningTime="2026-05-11 20:50:59.614322189 +0000 UTC m=+42.912466899" watchObservedRunningTime="2026-05-11 20:50:59.616440111 +0000 UTC m=+42.914584819"
May 11 20:50:59.640128 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:50:59.640076 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-544c98cc96-j75m5" podStartSLOduration=36.337115958 podStartE2EDuration="42.640058609s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:52.228479612 +0000 UTC m=+35.526624306" lastFinishedPulling="2026-05-11 20:50:58.531422256 +0000 UTC m=+41.829566957" observedRunningTime="2026-05-11 20:50:59.639947339 +0000 UTC m=+42.938092050" watchObservedRunningTime="2026-05-11 20:50:59.640058609 +0000 UTC m=+42.938203320"
May 11 20:51:00.499104 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:00.499069 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" event={"ID":"96cb7513-d136-4d23-90a5-47ea1604bb7b","Type":"ContainerStarted","Data":"4432e5ac4fdce4da24cb4695ada4e97891142b1fc5baa32ea4fb1071fff4401a"}
May 11 20:51:00.500876 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:00.500853 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/1.log"
May 11 20:51:00.501253 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:00.501229 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/0.log"
May 11 20:51:00.501352 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:00.501265 2562 generic.go:358] "Generic (PLEG): container finished" podID="bffdd990-0f6a-4e43-a62a-94c91746d6fc" containerID="7304f55c51eb2b7b9440c2e44f39818b9cdab033dbc8dc52aee25cc0a671f2be" exitCode=255
May 11 20:51:00.502495 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:00.501897 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" event={"ID":"bffdd990-0f6a-4e43-a62a-94c91746d6fc","Type":"ContainerDied","Data":"7304f55c51eb2b7b9440c2e44f39818b9cdab033dbc8dc52aee25cc0a671f2be"}
May 11 20:51:00.502495 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:00.501940 2562 scope.go:117] "RemoveContainer" containerID="4ac6628412dbb7b80a9769e72f31cc38b73edabf6065639c97cea9183102a6e3"
May 11 20:51:00.502495 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:00.502036 2562 scope.go:117] "RemoveContainer" containerID="7304f55c51eb2b7b9440c2e44f39818b9cdab033dbc8dc52aee25cc0a671f2be"
May 11 20:51:00.502495 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:00.502261 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-77758f4558-kfkm4_openshift-console-operator(bffdd990-0f6a-4e43-a62a-94c91746d6fc)\"" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" podUID="bffdd990-0f6a-4e43-a62a-94c91746d6fc"
May 11 20:51:00.544264 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:00.544208 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m4bgl" podStartSLOduration=11.308883263 podStartE2EDuration="43.544188624s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:20.036822796 +0000 UTC m=+3.334967485" lastFinishedPulling="2026-05-11 20:50:52.272128154 +0000 UTC m=+35.570272846" observedRunningTime="2026-05-11 20:51:00.540765346 +0000 UTC m=+43.838910055" watchObservedRunningTime="2026-05-11 20:51:00.544188624 +0000 UTC m=+43.842333335"
May 11 20:51:01.364403 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:01.364372 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zhs5x_9d55bb28-de13-44d3-9322-9b22abc5dc03/dns-node-resolver/0.log"
May 11 20:51:01.414176 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:01.414149 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-77758f4558-kfkm4"
May 11 20:51:01.414176 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:01.414181 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-77758f4558-kfkm4"
May 11 20:51:01.505618 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:01.505593 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/1.log"
May 11 20:51:01.506107 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:01.505994 2562 scope.go:117] "RemoveContainer" containerID="7304f55c51eb2b7b9440c2e44f39818b9cdab033dbc8dc52aee25cc0a671f2be"
May 11 20:51:01.506266 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:01.506244 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-77758f4558-kfkm4_openshift-console-operator(bffdd990-0f6a-4e43-a62a-94c91746d6fc)\"" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" podUID="bffdd990-0f6a-4e43-a62a-94c91746d6fc"
May 11 20:51:02.156294 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:02.156264 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p6nhn_a64fefbb-edf9-4ffa-adf6-0602e2c7e71b/node-ca/0.log"
May 11 20:51:02.509429 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:02.509401 2562 scope.go:117] "RemoveContainer" containerID="7304f55c51eb2b7b9440c2e44f39818b9cdab033dbc8dc52aee25cc0a671f2be"
May 11 20:51:02.509847 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:02.509643 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-77758f4558-kfkm4_openshift-console-operator(bffdd990-0f6a-4e43-a62a-94c91746d6fc)\"" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" podUID="bffdd990-0f6a-4e43-a62a-94c91746d6fc"
May 11 20:51:03.513385 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:03.513351 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9qsqx" event={"ID":"d85defd8-e86e-4d13-9e13-373afa866baa","Type":"ContainerStarted","Data":"df70c34d66b6ededc257104b71c39676e0d5a79c7af418c0859490c9a9eb2050"}
May 11 20:51:03.529903 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:03.529856 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9qsqx" podStartSLOduration=37.080030879 podStartE2EDuration="41.529843517s" podCreationTimestamp="2026-05-11 20:50:22 +0000 UTC" firstStartedPulling="2026-05-11 20:50:58.668721465 +0000 UTC m=+41.966866157" lastFinishedPulling="2026-05-11 20:51:03.118534107 +0000 UTC m=+46.416678795" observedRunningTime="2026-05-11 20:51:03.529087749 +0000 UTC m=+46.827232459" watchObservedRunningTime="2026-05-11 20:51:03.529843517 +0000 UTC m=+46.827988227"
May 11 20:51:07.042950 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:07.042908 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs"
May 11 20:51:07.043433 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:07.042991 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc"
May 11 20:51:07.043433 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.043072 2562 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
May 11 20:51:07.043433 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:07.043104 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw"
May 11 20:51:07.043433 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.043126 2562 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
May 11 20:51:07.043433 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.043141 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls podName:7ff2f12c-70ce-4a2c-8828-4562a60dc95d nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.043125075 +0000 UTC m=+66.341269768 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls") pod "dns-default-n8sxs" (UID: "7ff2f12c-70ce-4a2c-8828-4562a60dc95d") : secret "dns-default-metrics-tls" not found
May 11 20:51:07.043433 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.043144 2562 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8bb9d58f7-w68mc: secret "image-registry-tls" not found
May 11 20:51:07.043433 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.043196 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls podName:54fd2ae3-b688-40a5-b542-77c19799ef8a nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.043176433 +0000 UTC m=+66.341321135 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls") pod "image-registry-8bb9d58f7-w68mc" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a") : secret "image-registry-tls" not found
May 11 20:51:07.043433 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.043214 2562 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
May 11 20:51:07.043433 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.043282 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls podName:98bf1c69-fee9-4051-8fbc-c2fffe394f8b nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.043268268 +0000 UTC m=+66.341412972 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls") pod "cluster-samples-operator-5699b6b9d9-tghxw" (UID: "98bf1c69-fee9-4051-8fbc-c2fffe394f8b") : secret "samples-operator-tls" not found
May 11 20:51:07.144304 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:07.144275 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:51:07.144487 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:07.144318 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx"
May 11 20:51:07.144487 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:07.144378 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6"
May 11 20:51:07.144487 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:07.144411 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn"
May 11 20:51:07.144487 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:07.144431 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:51:07.144487 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.144477 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.144438846 +0000 UTC m=+66.442583555 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : configmap references non-existent config key: service-ca.crt
May 11 20:51:07.144723 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.144487 2562 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
May 11 20:51:07.144723 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.144530 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls podName:a8dd435c-a454-4ae4-935a-67c1f9c9ec81 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.144519781 +0000 UTC m=+66.442664479 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-5c487d988c-gldtx" (UID: "a8dd435c-a454-4ae4-935a-67c1f9c9ec81") : secret "cluster-monitoring-operator-tls" not found
May 11 20:51:07.144723 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.144529 2562 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
May 11 20:51:07.144723 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.144543 2562 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
May 11 20:51:07.144723 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.144495 2562 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
May 11 20:51:07.144723 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.144586 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert podName:9d81ee0b-b7dc-45a9-bc60-e7389a88feb1 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.14456725 +0000 UTC m=+66.442711952 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert") pod "ingress-canary-p92nn" (UID: "9d81ee0b-b7dc-45a9-bc60-e7389a88feb1") : secret "canary-serving-cert" not found
May 11 20:51:07.144723 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.144647 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert podName:17f13489-a9eb-4f66-85c8-6967aa3ec01a nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.144630123 +0000 UTC m=+66.442774819 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert") pod "networking-console-plugin-697665887d-5q7f6" (UID: "17f13489-a9eb-4f66-85c8-6967aa3ec01a") : secret "networking-console-plugin-cert" not found
May 11 20:51:07.144723 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:07.144664 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs podName:ea8d418d-c309-4692-8be3-e3a7eeb22225 nodeName:}" failed. No retries permitted until 2026-05-11 20:51:23.144654286 +0000 UTC m=+66.442798983 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs") pod "router-default-7589cfd5f4-pdcvt" (UID: "ea8d418d-c309-4692-8be3-e3a7eeb22225") : secret "router-metrics-certs-default" not found
May 11 20:51:16.309786 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:16.309752 2562 scope.go:117] "RemoveContainer" containerID="7304f55c51eb2b7b9440c2e44f39818b9cdab033dbc8dc52aee25cc0a671f2be"
May 11 20:51:16.545773 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:16.545745 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log"
May 11 20:51:16.546142 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:16.546128 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/1.log"
May 11 20:51:16.546223 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:16.546158 2562 generic.go:358] "Generic (PLEG): container finished" podID="bffdd990-0f6a-4e43-a62a-94c91746d6fc" containerID="06bc74605e0fdaec6e282d807972f2829b5176a3e81d7dc1068a26b9a0a68da3" exitCode=255
May 11 20:51:16.546223 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:16.546185 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" event={"ID":"bffdd990-0f6a-4e43-a62a-94c91746d6fc","Type":"ContainerDied","Data":"06bc74605e0fdaec6e282d807972f2829b5176a3e81d7dc1068a26b9a0a68da3"}
May 11 20:51:16.546223 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:16.546211 2562 scope.go:117] "RemoveContainer" containerID="7304f55c51eb2b7b9440c2e44f39818b9cdab033dbc8dc52aee25cc0a671f2be"
May 11 20:51:16.546558 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:16.546540 2562 scope.go:117] "RemoveContainer" containerID="06bc74605e0fdaec6e282d807972f2829b5176a3e81d7dc1068a26b9a0a68da3"
May 11 20:51:16.546756 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:16.546738 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-77758f4558-kfkm4_openshift-console-operator(bffdd990-0f6a-4e43-a62a-94c91746d6fc)\"" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" podUID="bffdd990-0f6a-4e43-a62a-94c91746d6fc"
May 11 20:51:17.550922 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:17.550894 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log"
May 11 20:51:18.439109 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:18.439079 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-knpxn"
May 11 20:51:21.414096 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:21.414063 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-77758f4558-kfkm4"
May 11 20:51:21.414096 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:21.414107 2562 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-77758f4558-kfkm4"
May 11 20:51:21.414649 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:21.414511 2562 scope.go:117] "RemoveContainer" containerID="06bc74605e0fdaec6e282d807972f2829b5176a3e81d7dc1068a26b9a0a68da3"
May 11 20:51:21.414749 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:21.414728 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-77758f4558-kfkm4_openshift-console-operator(bffdd990-0f6a-4e43-a62a-94c91746d6fc)\"" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" podUID="bffdd990-0f6a-4e43-a62a-94c91746d6fc"
May 11 20:51:23.083333 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.083290 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc"
May 11 20:51:23.083815 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.083378 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw"
May 11 20:51:23.083815 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.083423 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs"
May 11 20:51:23.086454 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.086420 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/98bf1c69-fee9-4051-8fbc-c2fffe394f8b-samples-operator-tls\") pod \"cluster-samples-operator-5699b6b9d9-tghxw\" (UID: \"98bf1c69-fee9-4051-8fbc-c2fffe394f8b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw"
May 11 20:51:23.086568 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.086423 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ff2f12c-70ce-4a2c-8828-4562a60dc95d-metrics-tls\") pod \"dns-default-n8sxs\" (UID: \"7ff2f12c-70ce-4a2c-8828-4562a60dc95d\") " pod="openshift-dns/dns-default-n8sxs"
May 11 20:51:23.086568 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.086490 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"image-registry-8bb9d58f7-w68mc\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc"
May 11 20:51:23.183791 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.183757 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw"
May 11 20:51:23.183791 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.183797 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:51:23.184001 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.183818 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx"
May 11 20:51:23.184001 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.183870 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq"
May 11 20:51:23.184001 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.183889 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6"
May 11 20:51:23.184001 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.183926 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn"
May 11 20:51:23.184001 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.183946 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:51:23.186418 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.186390 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ea8d418d-c309-4692-8be3-e3a7eeb22225-metrics-certs\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:51:23.186555 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.186462 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d81ee0b-b7dc-45a9-bc60-e7389a88feb1-cert\") pod \"ingress-canary-p92nn\" (UID: \"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1\") " pod="openshift-ingress-canary/ingress-canary-p92nn"
May 11 20:51:23.186607 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.186591 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
May 11 20:51:23.186648 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.186610 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a8dd435c-a454-4ae4-935a-67c1f9c9ec81-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-5c487d988c-gldtx\" (UID: \"a8dd435c-a454-4ae4-935a-67c1f9c9ec81\") " pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx"
May 11 20:51:23.186726 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.186705 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/17f13489-a9eb-4f66-85c8-6967aa3ec01a-networking-console-plugin-cert\") pod \"networking-console-plugin-697665887d-5q7f6\" (UID: \"17f13489-a9eb-4f66-85c8-6967aa3ec01a\") " pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6"
May 11 20:51:23.186942 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.186927 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tvs7\" (UniqueName: \"kubernetes.io/projected/bae2e16d-3454-4522-88aa-1afafb2e9cb1-kube-api-access-6tvs7\") pod \"network-check-target-xw5qw\" (UID: \"bae2e16d-3454-4522-88aa-1afafb2e9cb1\") " pod="openshift-network-diagnostics/network-check-target-xw5qw"
May 11 20:51:23.195120 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.195096 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ea8d418d-c309-4692-8be3-e3a7eeb22225-service-ca-bundle\") pod \"router-default-7589cfd5f4-pdcvt\" (UID: \"ea8d418d-c309-4692-8be3-e3a7eeb22225\") " pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:51:23.196464 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.196446 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3be5f296-2151-4f3e-b028-c72728d855da-metrics-certs\") pod \"network-metrics-daemon-2ccqq\" (UID: \"3be5f296-2151-4f3e-b028-c72728d855da\") " pod="openshift-multus/network-metrics-daemon-2ccqq"
May 11 20:51:23.227796 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.227767 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bdn2f\""
May 11 20:51:23.235814 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.235793 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw"
May 11 20:51:23.241619 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.241599 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mdvls\""
May 11 20:51:23.249476 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.249451 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kqkzb\""
May 11 20:51:23.249476 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.249465 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2ccqq"
May 11 20:51:23.258316 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.258289 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xw5qw"
May 11 20:51:23.269376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.269353 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wsq67\""
May 11 20:51:23.277369 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.277314 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-n8sxs"
May 11 20:51:23.284551 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.284521 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6vnrf\""
May 11 20:51:23.293505 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.293323 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc"
May 11 20:51:23.333606 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.333324 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-974lw\""
May 11 20:51:23.341141 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.341115 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7589cfd5f4-pdcvt"
May 11 20:51:23.350057 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.348815 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-td7cg\""
May 11 20:51:23.355670 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.355642 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6"
May 11 20:51:23.373238 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.373031 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-82xzn\""
May 11 20:51:23.378634 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.377934 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx"
May 11 20:51:23.380772 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.380534 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qhtb5\""
May 11 20:51:23.401421 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.400858 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p92nn"
May 11 20:51:23.425045 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.424525 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw"]
May 11 20:51:23.446465 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.446411 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2ccqq"]
May 11 20:51:23.486743 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.485446 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xw5qw"]
May 11 20:51:23.508507 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:51:23.507529 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbae2e16d_3454_4522_88aa_1afafb2e9cb1.slice/crio-f4c8766b32d9d9f0dfa529686e813783110f70027056759b238bc62043ee2492 WatchSource:0}: Error finding container f4c8766b32d9d9f0dfa529686e813783110f70027056759b238bc62043ee2492: Status 404 returned error can't find the container with id f4c8766b32d9d9f0dfa529686e813783110f70027056759b238bc62043ee2492
May 11 20:51:23.532712 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.532356 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-n8sxs"]
May 11 20:51:23.537878 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:51:23.537829 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff2f12c_70ce_4a2c_8828_4562a60dc95d.slice/crio-eb338a9b0b2f05c29deb3cd45c43ecf34c6f9d9b9204275176dee1a1a67f594e WatchSource:0}: Error finding container eb338a9b0b2f05c29deb3cd45c43ecf34c6f9d9b9204275176dee1a1a67f594e: Status 404 returned error can't find the container with id eb338a9b0b2f05c29deb3cd45c43ecf34c6f9d9b9204275176dee1a1a67f594e
May 11 20:51:23.554889 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.553696 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8bb9d58f7-w68mc"]
May 11 20:51:23.592456 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.592414 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n8sxs" event={"ID":"7ff2f12c-70ce-4a2c-8828-4562a60dc95d","Type":"ContainerStarted","Data":"eb338a9b0b2f05c29deb3cd45c43ecf34c6f9d9b9204275176dee1a1a67f594e"}
May 11 20:51:23.593855 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.593798 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xw5qw" event={"ID":"bae2e16d-3454-4522-88aa-1afafb2e9cb1","Type":"ContainerStarted","Data":"f4c8766b32d9d9f0dfa529686e813783110f70027056759b238bc62043ee2492"}
May 11 20:51:23.595934 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.595887 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" event={"ID":"98bf1c69-fee9-4051-8fbc-c2fffe394f8b","Type":"ContainerStarted","Data":"77ee467959db4108d97766ab4ef04dbc9607f37c4f9dff710a68a0af1c025d2d"}
May 11 20:51:23.597701 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.597676 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2ccqq" event={"ID":"3be5f296-2151-4f3e-b028-c72728d855da","Type":"ContainerStarted","Data":"d78b84066fe3449c88a9cb0417907606ff4d8229c1e9b60ae68ee99af6c32de1"}
May 11 20:51:23.623630 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.623332 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-697665887d-5q7f6"]
May 11 20:51:23.626593 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:51:23.626558 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f13489_a9eb_4f66_85c8_6967aa3ec01a.slice/crio-da32ca3f048ef22fc13cecca39ed15d442bae6ccf3388ba2aff5f58acf609079 WatchSource:0}: Error finding container da32ca3f048ef22fc13cecca39ed15d442bae6ccf3388ba2aff5f58acf609079: Status 404 returned error can't find the container with id da32ca3f048ef22fc13cecca39ed15d442bae6ccf3388ba2aff5f58acf609079
May 11 20:51:23.643656 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.643633 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx"]
May 11 20:51:23.646646 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.646618 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7589cfd5f4-pdcvt"]
May 11 20:51:23.655271 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:51:23.655242 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8dd435c_a454_4ae4_935a_67c1f9c9ec81.slice/crio-ea982b879520a2ecad96102f82e321fd6d7a5089439fc16c5d79b2ac32087e70 WatchSource:0}: Error finding container ea982b879520a2ecad96102f82e321fd6d7a5089439fc16c5d79b2ac32087e70: Status 404 returned error can't find the container with id ea982b879520a2ecad96102f82e321fd6d7a5089439fc16c5d79b2ac32087e70
May 11 20:51:23.655491 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:51:23.655468 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea8d418d_c309_4692_8be3_e3a7eeb22225.slice/crio-3739060b5bf3d653e20fac7cb44d76e9bb297a77b77de551115d13c15689b0a1 WatchSource:0}: Error finding container 3739060b5bf3d653e20fac7cb44d76e9bb297a77b77de551115d13c15689b0a1: Status 404 returned error can't find the container with id 3739060b5bf3d653e20fac7cb44d76e9bb297a77b77de551115d13c15689b0a1
May 11 20:51:23.660057 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.659993 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p92nn"]
May 11 20:51:23.663963 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:51:23.663938 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d81ee0b_b7dc_45a9_bc60_e7389a88feb1.slice/crio-e117af5b39a84cc2c71dd7c51264c4fbc40f06d68d7525ed35031cdd60516b47 WatchSource:0}: Error finding container e117af5b39a84cc2c71dd7c51264c4fbc40f06d68d7525ed35031cdd60516b47: Status 404 returned error can't find the container with id e117af5b39a84cc2c71dd7c51264c4fbc40f06d68d7525ed35031cdd60516b47
May 11 20:51:23.966239 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.966207 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d"]
May 11 20:51:23.989810 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.989773 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d"] May 11 20:51:23.990000 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.989916 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:23.992538 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.992508 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" May 11 20:51:23.992781 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.992763 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" May 11 20:51:23.992859 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.992785 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" May 11 20:51:23.992911 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.992891 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" May 11 20:51:23.992970 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.992942 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" May 11 20:51:23.993686 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.993660 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" May 11 20:51:23.993850 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:23.993676 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" May 11 20:51:24.090905 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.090581 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.090905 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.090633 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-hub\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.090905 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.090659 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d15eaa12-fd27-44e5-a9dd-e75da067a870-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.090905 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.090698 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4n6p\" (UniqueName: \"kubernetes.io/projected/d15eaa12-fd27-44e5-a9dd-e75da067a870-kube-api-access-g4n6p\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.090905 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.090745 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.090905 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.090794 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-ca\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.097330 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.097166 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jjq7d"] May 11 20:51:24.108390 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.108359 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.111911 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.111435 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s999c\"" May 11 20:51:24.111911 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.111468 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" May 11 20:51:24.111911 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.111782 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" May 11 20:51:24.114210 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.114165 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jjq7d"] May 11 20:51:24.191720 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.191672 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.191720 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.191721 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-hub\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.191978 ip-10-0-135-190 
kubenswrapper[2562]: I0511 20:51:24.191801 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d15eaa12-fd27-44e5-a9dd-e75da067a870-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.191978 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.191859 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4n6p\" (UniqueName: \"kubernetes.io/projected/d15eaa12-fd27-44e5-a9dd-e75da067a870-kube-api-access-g4n6p\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.192613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.192233 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36a1e166-0e33-4883-8711-cbbba5eb371c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.192613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.192290 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.192613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.192379 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36a1e166-0e33-4883-8711-cbbba5eb371c-data-volume\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.192613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.192427 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-ca\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.192613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.192469 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36a1e166-0e33-4883-8711-cbbba5eb371c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.192613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.192498 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36a1e166-0e33-4883-8711-cbbba5eb371c-crio-socket\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") 
" pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.192613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.192550 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/d15eaa12-fd27-44e5-a9dd-e75da067a870-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.192613 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.192580 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7djb8\" (UniqueName: \"kubernetes.io/projected/36a1e166-0e33-4883-8711-cbbba5eb371c-kube-api-access-7djb8\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.195443 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.195391 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.195587 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.195563 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-ca\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.195785 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.195757 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.196282 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.196251 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/d15eaa12-fd27-44e5-a9dd-e75da067a870-hub\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.202692 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.202665 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4n6p\" (UniqueName: \"kubernetes.io/projected/d15eaa12-fd27-44e5-a9dd-e75da067a870-kube-api-access-g4n6p\") pod \"cluster-proxy-proxy-agent-57cffc5786-pf68d\" (UID: \"d15eaa12-fd27-44e5-a9dd-e75da067a870\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.293655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.293578 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36a1e166-0e33-4883-8711-cbbba5eb371c-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.293655 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.293635 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36a1e166-0e33-4883-8711-cbbba5eb371c-data-volume\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.293865 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.293681 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36a1e166-0e33-4883-8711-cbbba5eb371c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.293865 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.293706 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36a1e166-0e33-4883-8711-cbbba5eb371c-crio-socket\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.293865 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.293741 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7djb8\" (UniqueName: \"kubernetes.io/projected/36a1e166-0e33-4883-8711-cbbba5eb371c-kube-api-access-7djb8\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.294033 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.293911 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36a1e166-0e33-4883-8711-cbbba5eb371c-crio-socket\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.294116 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.294076 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36a1e166-0e33-4883-8711-cbbba5eb371c-data-volume\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.294378 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.294356 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36a1e166-0e33-4883-8711-cbbba5eb371c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.296663 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.296618 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36a1e166-0e33-4883-8711-cbbba5eb371c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 
20:51:24.304956 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.304907 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7djb8\" (UniqueName: \"kubernetes.io/projected/36a1e166-0e33-4883-8711-cbbba5eb371c-kube-api-access-7djb8\") pod \"insights-runtime-extractor-jjq7d\" (UID: \"36a1e166-0e33-4883-8711-cbbba5eb371c\") " pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.316079 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.315979 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" May 11 20:51:24.437867 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.437837 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jjq7d" May 11 20:51:24.491890 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.491832 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d"] May 11 20:51:24.494544 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:51:24.494509 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd15eaa12_fd27_44e5_a9dd_e75da067a870.slice/crio-bff435b590db85aa716e1cf64a1e26f82b39b047d7927448115be648a0a5a9ba WatchSource:0}: Error finding container bff435b590db85aa716e1cf64a1e26f82b39b047d7927448115be648a0a5a9ba: Status 404 returned error can't find the container with id bff435b590db85aa716e1cf64a1e26f82b39b047d7927448115be648a0a5a9ba May 11 20:51:24.609255 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.609185 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" event={"ID":"d15eaa12-fd27-44e5-a9dd-e75da067a870","Type":"ContainerStarted","Data":"bff435b590db85aa716e1cf64a1e26f82b39b047d7927448115be648a0a5a9ba"} May 11 20:51:24.615336 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.615302 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p92nn" event={"ID":"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1","Type":"ContainerStarted","Data":"e117af5b39a84cc2c71dd7c51264c4fbc40f06d68d7525ed35031cdd60516b47"} May 11 20:51:24.619111 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.619031 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" event={"ID":"ea8d418d-c309-4692-8be3-e3a7eeb22225","Type":"ContainerStarted","Data":"4d22c40f6a1daa4a02d89464f4fddacf5e4cc045349fb32c97de2471d1b34c72"} May 11 20:51:24.619111 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.619072 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" event={"ID":"ea8d418d-c309-4692-8be3-e3a7eeb22225","Type":"ContainerStarted","Data":"3739060b5bf3d653e20fac7cb44d76e9bb297a77b77de551115d13c15689b0a1"} May 11 20:51:24.625094 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.624614 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" event={"ID":"54fd2ae3-b688-40a5-b542-77c19799ef8a","Type":"ContainerStarted","Data":"cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88"} May 11 20:51:24.625094 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.624651 2562 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" event={"ID":"54fd2ae3-b688-40a5-b542-77c19799ef8a","Type":"ContainerStarted","Data":"590938bc2153fafe07617a7d6c2292d409dc36d7688e6186f8984540b1843e4e"} May 11 20:51:24.625094 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.624930 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:51:24.626376 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.626350 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" event={"ID":"a8dd435c-a454-4ae4-935a-67c1f9c9ec81","Type":"ContainerStarted","Data":"ea982b879520a2ecad96102f82e321fd6d7a5089439fc16c5d79b2ac32087e70"} May 11 20:51:24.635734 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.633786 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" event={"ID":"17f13489-a9eb-4f66-85c8-6967aa3ec01a","Type":"ContainerStarted","Data":"da32ca3f048ef22fc13cecca39ed15d442bae6ccf3388ba2aff5f58acf609079"} May 11 20:51:24.635734 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.634256 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jjq7d"] May 11 20:51:24.641349 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.641294 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" podStartSLOduration=67.641277747 podStartE2EDuration="1m7.641277747s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:51:24.640943717 +0000 UTC m=+67.939088429" watchObservedRunningTime="2026-05-11 20:51:24.641277747 +0000 UTC m=+67.939422455" May 11 20:51:24.641664 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.641571 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xw5qw" event={"ID":"bae2e16d-3454-4522-88aa-1afafb2e9cb1","Type":"ContainerStarted","Data":"25e723924247d4a25ba49a88bbb0c2139cdb58cdc227fd7c80c6ffa362557194"} May 11 20:51:24.642145 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.642125 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:51:24.665520 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.665324 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" podStartSLOduration=67.665285938 podStartE2EDuration="1m7.665285938s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:51:24.66406444 +0000 UTC m=+67.962209149" watchObservedRunningTime="2026-05-11 20:51:24.665285938 +0000 UTC m=+67.963430650" May 11 20:51:24.685751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:24.685381 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xw5qw" podStartSLOduration=67.685361321 podStartE2EDuration="1m7.685361321s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:51:24.683812448 +0000 UTC m=+67.981957160" watchObservedRunningTime="2026-05-11 20:51:24.685361321 +0000 UTC m=+67.983506032" May 11 20:51:25.342378 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:25.342278 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:51:25.345852 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:25.345665 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:51:25.644810 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:25.644706 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:51:25.645809 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:25.645790 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7589cfd5f4-pdcvt" May 11 20:51:25.651146 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:51:25.651124 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a1e166_0e33_4883_8711_cbbba5eb371c.slice/crio-11ee54a800d6a569f5a6b66f4ef3c76b829c82059f5232f7b3259c8e5b5a1ba3 WatchSource:0}: Error finding container 11ee54a800d6a569f5a6b66f4ef3c76b829c82059f5232f7b3259c8e5b5a1ba3: Status 404 returned error can't find the container with id 11ee54a800d6a569f5a6b66f4ef3c76b829c82059f5232f7b3259c8e5b5a1ba3 May 11 20:51:26.649090 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:26.649050 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjq7d" event={"ID":"36a1e166-0e33-4883-8711-cbbba5eb371c","Type":"ContainerStarted","Data":"11ee54a800d6a569f5a6b66f4ef3c76b829c82059f5232f7b3259c8e5b5a1ba3"} May 11 20:51:30.660319 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.660275 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" event={"ID":"a8dd435c-a454-4ae4-935a-67c1f9c9ec81","Type":"ContainerStarted","Data":"c4a09c0ca5400a2daafdd4b2a2703e23575352a1a7ef9b4ae0293e7bf06991a5"} May 11 20:51:30.661639 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.661605 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" event={"ID":"98bf1c69-fee9-4051-8fbc-c2fffe394f8b","Type":"ContainerStarted","Data":"36f0aa0d6e73dc6f43696d7bcd2bae2f191e24d70e2b3b3c68802c5038e29a2c"} May 11 20:51:30.663085 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.663063 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2ccqq" event={"ID":"3be5f296-2151-4f3e-b028-c72728d855da","Type":"ContainerStarted","Data":"4be516a0dea3edd67060f57b0c183494138dda92fa1c643bd3052d430c24bee6"} May 11 20:51:30.664797 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.664528 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjq7d" event={"ID":"36a1e166-0e33-4883-8711-cbbba5eb371c","Type":"ContainerStarted","Data":"510dfff050edbc3a80e17da99c80d34269e753bff6757629789a69823e08e302"} May 11 20:51:30.665996 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.665972 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" event={"ID":"17f13489-a9eb-4f66-85c8-6967aa3ec01a","Type":"ContainerStarted","Data":"840c64bceebfb533f92650d4c0e872e07049a1fb25a13fe874899e69cd5ee543"} May 11 20:51:30.667293 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.667267 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n8sxs" event={"ID":"7ff2f12c-70ce-4a2c-8828-4562a60dc95d","Type":"ContainerStarted","Data":"c0a00731e3e8f6ca3a8173ac069d1a63074376aed7e31190afd67a421c78f182"} May 11 20:51:30.668517 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.668496 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p92nn" event={"ID":"9d81ee0b-b7dc-45a9-bc60-e7389a88feb1","Type":"ContainerStarted","Data":"1d55a752835ae534b2e781c9a8e2a9171056091ab7e156c07eb3c05ee66b0e17"} May 11 20:51:30.677737 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.677685 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-5c487d988c-gldtx" podStartSLOduration=67.02126934 podStartE2EDuration="1m13.677668372s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:51:23.657035963 +0000 UTC m=+66.955180652" lastFinishedPulling="2026-05-11 20:51:30.31343498 +0000 UTC m=+73.611579684" observedRunningTime="2026-05-11 20:51:30.677641947 +0000 UTC m=+73.975786656" watchObservedRunningTime="2026-05-11 20:51:30.677668372 +0000 UTC m=+73.975813084" May 11 20:51:30.697031 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.696977 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p92nn" podStartSLOduration=33.741438431 podStartE2EDuration="39.696966783s" podCreationTimestamp="2026-05-11 20:50:51 +0000 UTC" firstStartedPulling="2026-05-11 20:51:23.665203924 +0000 UTC m=+66.963348613" lastFinishedPulling="2026-05-11 20:51:29.620732267 +0000 UTC m=+72.918876965" observedRunningTime="2026-05-11 20:51:30.696561863 +0000 UTC m=+73.994706575" watchObservedRunningTime="2026-05-11 20:51:30.696966783 +0000 UTC m=+73.995111493" May 11 20:51:30.712731 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:30.712682 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-697665887d-5q7f6" podStartSLOduration=57.033061534 podStartE2EDuration="1m3.712664657s" podCreationTimestamp="2026-05-11 20:50:27 +0000 UTC" firstStartedPulling="2026-05-11 20:51:23.632207802 +0000 UTC m=+66.930352491" lastFinishedPulling="2026-05-11 20:51:30.311810924 +0000 UTC m=+73.609955614" observedRunningTime="2026-05-11 20:51:30.711819486 +0000 UTC m=+74.009964198" watchObservedRunningTime="2026-05-11 20:51:30.712664657 +0000 UTC m=+74.010809371" May 11 20:51:31.673100 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:31.673020 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" event={"ID":"98bf1c69-fee9-4051-8fbc-c2fffe394f8b","Type":"ContainerStarted","Data":"db9f5cbd9fd583285f43d73580817eb82dab6bc75531ea1ef18ee765f364abbf"} May 11 20:51:31.674490 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:31.674456 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2ccqq" 
event={"ID":"3be5f296-2151-4f3e-b028-c72728d855da","Type":"ContainerStarted","Data":"82031b97bf7a50f7bcdf4262572281fe9c322618c9b1f3eab5735416e39af35a"} May 11 20:51:31.675931 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:31.675908 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjq7d" event={"ID":"36a1e166-0e33-4883-8711-cbbba5eb371c","Type":"ContainerStarted","Data":"8a3010ef2155729cd81dfa8bbbf4e640787f3ecc09a7b32bda2f43672fe7e2b7"} May 11 20:51:31.677204 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:31.677185 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" event={"ID":"d15eaa12-fd27-44e5-a9dd-e75da067a870","Type":"ContainerStarted","Data":"3eebb919147a82dae34c9838131d623e5db777c533078028e385fef848321396"} May 11 20:51:31.678589 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:31.678559 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-n8sxs" event={"ID":"7ff2f12c-70ce-4a2c-8828-4562a60dc95d","Type":"ContainerStarted","Data":"7405c08a3347e29bb286ea0815234e35c3c61d5280629fd96f3dfa53a5006df8"} May 11 20:51:31.678683 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:31.678669 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-n8sxs" May 11 20:51:31.689809 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:31.689772 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-5699b6b9d9-tghxw" podStartSLOduration=67.950495632 podStartE2EDuration="1m14.689761187s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:51:23.576024742 +0000 UTC m=+66.874169445" lastFinishedPulling="2026-05-11 20:51:30.315290304 +0000 UTC m=+73.613435000" observedRunningTime="2026-05-11 20:51:31.688760499 +0000 UTC m=+74.986905211" watchObservedRunningTime="2026-05-11 20:51:31.689761187 +0000 UTC m=+74.987905898" May 11 20:51:31.704004 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:31.703941 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2ccqq" podStartSLOduration=67.876020913 podStartE2EDuration="1m14.703929606s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:51:23.483901058 +0000 UTC m=+66.782045754" lastFinishedPulling="2026-05-11 20:51:30.311809755 +0000 UTC m=+73.609954447" observedRunningTime="2026-05-11 20:51:31.703506267 +0000 UTC m=+75.001650978" watchObservedRunningTime="2026-05-11 20:51:31.703929606 +0000 UTC m=+75.002074318" May 11 20:51:31.719145 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:31.719103 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-n8sxs" podStartSLOduration=34.639063127 podStartE2EDuration="40.719092826s" podCreationTimestamp="2026-05-11 20:50:51 +0000 UTC" firstStartedPulling="2026-05-11 20:51:23.540698616 +0000 UTC m=+66.838843304" lastFinishedPulling="2026-05-11 20:51:29.620728301 +0000 UTC m=+72.918873003" observedRunningTime="2026-05-11 20:51:31.718104392 +0000 UTC m=+75.016249113" watchObservedRunningTime="2026-05-11 20:51:31.719092826 +0000 UTC m=+75.017237536" May 11 20:51:34.698665 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:34.698634 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jjq7d" 
event={"ID":"36a1e166-0e33-4883-8711-cbbba5eb371c","Type":"ContainerStarted","Data":"9b596526e0f1bff0be5d48983d7dcb6ed617ebbbe104bfddb0087b20cdf5cfa8"} May 11 20:51:34.700460 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:34.700433 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" event={"ID":"d15eaa12-fd27-44e5-a9dd-e75da067a870","Type":"ContainerStarted","Data":"e75fdc8998c4564557e93777c0aab6fd4efe14a14e3a0780e4101dafb14fa47c"} May 11 20:51:34.715953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:34.715904 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jjq7d" podStartSLOduration=6.573910418 podStartE2EDuration="10.715888836s" podCreationTimestamp="2026-05-11 20:51:24 +0000 UTC" firstStartedPulling="2026-05-11 20:51:30.414227348 +0000 UTC m=+73.712372038" lastFinishedPulling="2026-05-11 20:51:34.556205767 +0000 UTC m=+77.854350456" observedRunningTime="2026-05-11 20:51:34.71451049 +0000 UTC m=+78.012655203" watchObservedRunningTime="2026-05-11 20:51:34.715888836 +0000 UTC m=+78.014033549" May 11 20:51:35.310495 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:35.310460 2562 scope.go:117] "RemoveContainer" containerID="06bc74605e0fdaec6e282d807972f2829b5176a3e81d7dc1068a26b9a0a68da3" May 11 20:51:35.310710 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:51:35.310688 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-77758f4558-kfkm4_openshift-console-operator(bffdd990-0f6a-4e43-a62a-94c91746d6fc)\"" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" podUID="bffdd990-0f6a-4e43-a62a-94c91746d6fc" May 11 20:51:35.704879 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:35.704841 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" event={"ID":"d15eaa12-fd27-44e5-a9dd-e75da067a870","Type":"ContainerStarted","Data":"25f80d35705c22221bc2624f95c0bb479d0cd596e94d04502197bca90e7cb433"} May 11 20:51:35.723880 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:35.723829 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-57cffc5786-pf68d" podStartSLOduration=2.600235086 podStartE2EDuration="12.723815173s" podCreationTimestamp="2026-05-11 20:51:23 +0000 UTC" firstStartedPulling="2026-05-11 20:51:24.497715699 +0000 UTC m=+67.795860394" lastFinishedPulling="2026-05-11 20:51:34.621295789 +0000 UTC m=+77.919440481" observedRunningTime="2026-05-11 20:51:35.722237181 +0000 UTC m=+79.020381903" watchObservedRunningTime="2026-05-11 20:51:35.723815173 +0000 UTC m=+79.021959948" May 11 20:51:40.460523 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.460371 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gcmjj"] May 11 20:51:40.465125 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.465104 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.470520 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.470500 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" May 11 20:51:40.471805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.471626 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-h27ck\"" May 11 20:51:40.471805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.471678 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" May 11 20:51:40.471805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.471704 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" May 11 20:51:40.471805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.471714 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" May 11 20:51:40.622539 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.622502 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmr2b\" (UniqueName: \"kubernetes.io/projected/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-kube-api-access-cmr2b\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.622539 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.622538 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-sys\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.622751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.622559 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-root\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.622751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.622577 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-tls\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.622751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.622627 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-metrics-client-ca\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.622751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.622652 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-textfile\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.622751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.622669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-wtmp\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.622751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.622687 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.622751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.622711 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-accelerators-collector-config\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724069 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.723965 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmr2b\" (UniqueName: \"kubernetes.io/projected/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-kube-api-access-cmr2b\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724069 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724000 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-sys\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724069 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724061 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-root\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724345 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724083 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-tls\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724345 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724151 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-metrics-client-ca\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " 
pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724345 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724174 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-sys\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724345 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724189 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-root\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724345 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724225 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-textfile\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724345 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724249 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-wtmp\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724345 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724278 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724345 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724326 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-accelerators-collector-config\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724773 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724427 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-wtmp\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724773 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724724 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-textfile\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724848 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724786 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-metrics-client-ca\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.724895 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.724844 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-accelerators-collector-config\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.727149 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.727126 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-tls\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.727256 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.727234 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.735921 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.735897 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmr2b\" (UniqueName: \"kubernetes.io/projected/b98ddfda-a947-4d98-b77a-55a1bddfc8e4-kube-api-access-cmr2b\") pod \"node-exporter-gcmjj\" (UID: \"b98ddfda-a947-4d98-b77a-55a1bddfc8e4\") " pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.773978 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:40.773940 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-gcmjj" May 11 20:51:40.782935 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:51:40.782908 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb98ddfda_a947_4d98_b77a_55a1bddfc8e4.slice/crio-136b4024beccee63aa021821ab3445049a7738cda0520c31b124d67657697f0a WatchSource:0}: Error finding container 136b4024beccee63aa021821ab3445049a7738cda0520c31b124d67657697f0a: Status 404 returned error can't find the container with id 136b4024beccee63aa021821ab3445049a7738cda0520c31b124d67657697f0a May 11 20:51:41.686796 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:41.686768 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-n8sxs" May 11 20:51:41.721908 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:41.721872 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gcmjj" event={"ID":"b98ddfda-a947-4d98-b77a-55a1bddfc8e4","Type":"ContainerStarted","Data":"bf47fcb4c5f6dee7405f6653437cf2434049495befccc3d82aea4d37789a0dad"} May 11 20:51:41.722092 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:41.721917 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gcmjj" event={"ID":"b98ddfda-a947-4d98-b77a-55a1bddfc8e4","Type":"ContainerStarted","Data":"136b4024beccee63aa021821ab3445049a7738cda0520c31b124d67657697f0a"} May 11 20:51:42.726069 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:42.726036 2562 generic.go:358] "Generic (PLEG): container finished" podID="b98ddfda-a947-4d98-b77a-55a1bddfc8e4" containerID="bf47fcb4c5f6dee7405f6653437cf2434049495befccc3d82aea4d37789a0dad" exitCode=0 May 11 20:51:42.726456 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:42.726081 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gcmjj" event={"ID":"b98ddfda-a947-4d98-b77a-55a1bddfc8e4","Type":"ContainerDied","Data":"bf47fcb4c5f6dee7405f6653437cf2434049495befccc3d82aea4d37789a0dad"} May 11 20:51:43.730453 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:43.730420 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gcmjj" event={"ID":"b98ddfda-a947-4d98-b77a-55a1bddfc8e4","Type":"ContainerStarted","Data":"8ff73be4edc5e952de7f2304ec6d311dcc464ea97b495880ffefd13ce43cf0af"} May 11 20:51:43.730453 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:43.730460 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gcmjj" event={"ID":"b98ddfda-a947-4d98-b77a-55a1bddfc8e4","Type":"ContainerStarted","Data":"cca7de74ce9cbcb04bb17f47f12025d30c3ec7657f339b6c005235f282f75b56"} May 11 20:51:43.752387 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:43.752334 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gcmjj" podStartSLOduration=2.916315092 podStartE2EDuration="3.752321371s" podCreationTimestamp="2026-05-11 20:51:40 +0000 UTC" firstStartedPulling="2026-05-11 20:51:40.784732711 +0000 UTC m=+84.082877400" lastFinishedPulling="2026-05-11 20:51:41.620738983 +0000 UTC m=+84.918883679" observedRunningTime="2026-05-11 20:51:43.750540812 +0000 UTC m=+87.048685514" watchObservedRunningTime="2026-05-11 20:51:43.752321371 +0000 UTC m=+87.050466082" May 11 20:51:45.648294 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:45.648268 2562 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:51:45.829829 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:45.829795 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8bb9d58f7-w68mc"] May 11 20:51:47.311175 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:47.311145 2562 scope.go:117] "RemoveContainer" containerID="06bc74605e0fdaec6e282d807972f2829b5176a3e81d7dc1068a26b9a0a68da3" May 11 20:51:47.743788 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:47.743757 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 20:51:47.743960 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:47.743813 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" event={"ID":"bffdd990-0f6a-4e43-a62a-94c91746d6fc","Type":"ContainerStarted","Data":"73ac145fd4f1f0d6e425378058439775231bee2ad294aeb9dc25c282be28bc30"} May 11 20:51:47.744091 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:47.744071 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:51:47.761306 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:47.761244 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" podStartSLOduration=84.459912493 podStartE2EDuration="1m30.761226285s" podCreationTimestamp="2026-05-11 20:50:17 +0000 UTC" firstStartedPulling="2026-05-11 20:50:52.229225568 +0000 UTC m=+35.527370263" lastFinishedPulling="2026-05-11 20:50:58.530539352 +0000 UTC m=+41.828684055" observedRunningTime="2026-05-11 20:51:47.760854381 +0000 UTC m=+91.058999093" watchObservedRunningTime="2026-05-11 20:51:47.761226285 +0000 UTC m=+91.059371002" May 11 20:51:47.844603 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:47.844569 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-77758f4558-kfkm4" May 11 20:51:56.652506 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:51:56.652469 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xw5qw" May 11 20:52:04.797425 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:04.797341 2562 generic.go:358] "Generic (PLEG): container finished" podID="ba5697e4-10a6-472f-b7fc-5b4568baf16b" containerID="f7e6e09fcaca547b8dd31e161289ac1331b0599446a145d605783d9cfdad0933" exitCode=0 May 11 20:52:04.797781 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:04.797417 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" event={"ID":"ba5697e4-10a6-472f-b7fc-5b4568baf16b","Type":"ContainerDied","Data":"f7e6e09fcaca547b8dd31e161289ac1331b0599446a145d605783d9cfdad0933"} May 11 20:52:04.797781 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:04.797753 2562 scope.go:117] "RemoveContainer" containerID="f7e6e09fcaca547b8dd31e161289ac1331b0599446a145d605783d9cfdad0933" May 11 20:52:05.801933 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:05.801899 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-686cb587d-jshx7" 
event={"ID":"ba5697e4-10a6-472f-b7fc-5b4568baf16b","Type":"ContainerStarted","Data":"0540ccf38112772c1b563b2fd9f9512bd186a8d97ec914d85f5bc360ec0bdd8c"} May 11 20:52:10.852028 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:10.851958 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" podUID="54fd2ae3-b688-40a5-b542-77c19799ef8a" containerName="registry" containerID="cri-o://cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88" gracePeriod=30 May 11 20:52:11.096216 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.096195 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" May 11 20:52:11.114756 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.114689 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-trusted-ca\") pod \"54fd2ae3-b688-40a5-b542-77c19799ef8a\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " May 11 20:52:11.114756 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.114728 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") pod \"54fd2ae3-b688-40a5-b542-77c19799ef8a\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " May 11 20:52:11.114965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.114756 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-image-registry-private-configuration\") pod \"54fd2ae3-b688-40a5-b542-77c19799ef8a\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " May 11 20:52:11.114965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.114784 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-installation-pull-secrets\") pod \"54fd2ae3-b688-40a5-b542-77c19799ef8a\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " May 11 20:52:11.114965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.114813 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-bound-sa-token\") pod \"54fd2ae3-b688-40a5-b542-77c19799ef8a\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " May 11 20:52:11.114965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.114867 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k668b\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-kube-api-access-k668b\") pod \"54fd2ae3-b688-40a5-b542-77c19799ef8a\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " May 11 20:52:11.114965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.114893 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54fd2ae3-b688-40a5-b542-77c19799ef8a-ca-trust-extracted\") pod \"54fd2ae3-b688-40a5-b542-77c19799ef8a\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " May 11 20:52:11.114965 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.114930 2562 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-certificates\") pod \"54fd2ae3-b688-40a5-b542-77c19799ef8a\" (UID: \"54fd2ae3-b688-40a5-b542-77c19799ef8a\") " May 11 20:52:11.115329 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.115180 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "54fd2ae3-b688-40a5-b542-77c19799ef8a" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:52:11.115504 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.115479 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "54fd2ae3-b688-40a5-b542-77c19799ef8a" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 11 20:52:11.118149 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.118102 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "54fd2ae3-b688-40a5-b542-77c19799ef8a" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 20:52:11.118799 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.118291 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-kube-api-access-k668b" (OuterVolumeSpecName: "kube-api-access-k668b") pod "54fd2ae3-b688-40a5-b542-77c19799ef8a" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a"). InnerVolumeSpecName "kube-api-access-k668b". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:52:11.118799 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.118374 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "54fd2ae3-b688-40a5-b542-77c19799ef8a" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:52:11.119178 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.119155 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "54fd2ae3-b688-40a5-b542-77c19799ef8a" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:52:11.123275 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.123136 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "54fd2ae3-b688-40a5-b542-77c19799ef8a" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a"). 
May 11 20:52:11.127159 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.127131 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54fd2ae3-b688-40a5-b542-77c19799ef8a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "54fd2ae3-b688-40a5-b542-77c19799ef8a" (UID: "54fd2ae3-b688-40a5-b542-77c19799ef8a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
May 11 20:52:11.215686 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.215659 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k668b\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-kube-api-access-k668b\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\""
May 11 20:52:11.215686 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.215684 2562 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54fd2ae3-b688-40a5-b542-77c19799ef8a-ca-trust-extracted\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\""
May 11 20:52:11.215812 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.215693 2562 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-certificates\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\""
May 11 20:52:11.215812 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.215703 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54fd2ae3-b688-40a5-b542-77c19799ef8a-trusted-ca\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\""
May 11 20:52:11.215812 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.215712 2562 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-registry-tls\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\""
May 11 20:52:11.215812 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.215721 2562 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-image-registry-private-configuration\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\""
May 11 20:52:11.215812 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.215731 2562 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54fd2ae3-b688-40a5-b542-77c19799ef8a-installation-pull-secrets\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\""
May 11 20:52:11.215812 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.215740 2562 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54fd2ae3-b688-40a5-b542-77c19799ef8a-bound-sa-token\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\""
May 11 20:52:11.818908 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.818875 2562 generic.go:358] "Generic (PLEG): container finished" podID="54fd2ae3-b688-40a5-b542-77c19799ef8a" containerID="cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88" exitCode=0
May 11 20:52:11.819120 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.818969 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" event={"ID":"54fd2ae3-b688-40a5-b542-77c19799ef8a","Type":"ContainerDied","Data":"cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88"}
May 11 20:52:11.819120 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.819028 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc"
May 11 20:52:11.819120 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.819039 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8bb9d58f7-w68mc" event={"ID":"54fd2ae3-b688-40a5-b542-77c19799ef8a","Type":"ContainerDied","Data":"590938bc2153fafe07617a7d6c2292d409dc36d7688e6186f8984540b1843e4e"}
May 11 20:52:11.819120 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.819059 2562 scope.go:117] "RemoveContainer" containerID="cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88"
May 11 20:52:11.827971 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.827947 2562 scope.go:117] "RemoveContainer" containerID="cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88"
May 11 20:52:11.831132 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:52:11.831090 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88\": container with ID starting with cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88 not found: ID does not exist" containerID="cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88"
May 11 20:52:11.831239 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.831136 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88"} err="failed to get container status \"cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88\": rpc error: code = NotFound desc = could not find container \"cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88\": container with ID starting with cec87486bb65b9b5ffb18f6665d2b9585f48fc77ac3d04aef64e242a7fe66c88 not found: ID does not exist"
May 11 20:52:11.837606 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.837570 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8bb9d58f7-w68mc"]
May 11 20:52:11.839595 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:11.839574 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8bb9d58f7-w68mc"]
May 11 20:52:13.313377 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:13.313341 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54fd2ae3-b688-40a5-b542-77c19799ef8a" path="/var/lib/kubelet/pods/54fd2ae3-b688-40a5-b542-77c19799ef8a/volumes"
May 11 20:52:25.857516 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:25.857479 2562 generic.go:358] "Generic (PLEG): container finished" podID="d7e41df8-69e8-4481-9aa4-0456bce8d7df" containerID="bcd15c89f6873c3d0d683c6322ad9238722065d7e6670e6ad9a4eba65279e0d5" exitCode=0
May 11 20:52:25.857875 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:25.857552 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" event={"ID":"d7e41df8-69e8-4481-9aa4-0456bce8d7df","Type":"ContainerDied","Data":"bcd15c89f6873c3d0d683c6322ad9238722065d7e6670e6ad9a4eba65279e0d5"}
May 11 20:52:25.857875 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:25.857828 2562 scope.go:117] "RemoveContainer" containerID="bcd15c89f6873c3d0d683c6322ad9238722065d7e6670e6ad9a4eba65279e0d5"
May 11 20:52:26.861581 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:26.861544 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-649b864788-qhtlj" event={"ID":"d7e41df8-69e8-4481-9aa4-0456bce8d7df","Type":"ContainerStarted","Data":"da63b28fe6fb50f78608fd52ce681fa68167dd57f5722bc5739d4c914e82092a"}
May 11 20:52:29.870097 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:29.870066 2562 generic.go:358] "Generic (PLEG): container finished" podID="0311189e-d497-4d2c-a742-ad52f624750a" containerID="f8887b3666fe8266de153830d96f8133100cdbd8a5355dc938e53a65cf4a0f1b" exitCode=0
May 11 20:52:29.870451 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:29.870138 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-544c98cc96-j75m5" event={"ID":"0311189e-d497-4d2c-a742-ad52f624750a","Type":"ContainerDied","Data":"f8887b3666fe8266de153830d96f8133100cdbd8a5355dc938e53a65cf4a0f1b"}
May 11 20:52:29.870451 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:29.870424 2562 scope.go:117] "RemoveContainer" containerID="f8887b3666fe8266de153830d96f8133100cdbd8a5355dc938e53a65cf4a0f1b"
May 11 20:52:30.874689 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:52:30.874652 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-544c98cc96-j75m5" event={"ID":"0311189e-d497-4d2c-a742-ad52f624750a","Type":"ContainerStarted","Data":"97e2a3805bf022c262fe3921655fc5a1b620b063479406cc5f1e258890c27de7"}
May 11 20:55:17.222191 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:55:17.222155 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log"
May 11 20:55:17.222854 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:55:17.222834 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log"
May 11 20:55:17.226643 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:55:17.226623 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log"
May 11 20:55:17.227553 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:55:17.227534 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log"
May 11 20:55:17.231938 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:55:17.231920 2562 kubelet.go:1628] "Image garbage collection succeeded"
May 11 20:56:18.420701 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.420609 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"]
May 11 20:56:18.421226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.421077 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54fd2ae3-b688-40a5-b542-77c19799ef8a" containerName="registry"
May 11 20:56:18.421226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.421095 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="54fd2ae3-b688-40a5-b542-77c19799ef8a" containerName="registry"
May 11 20:56:18.421226 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.421172 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="54fd2ae3-b688-40a5-b542-77c19799ef8a" containerName="registry"
May 11 20:56:18.423973 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.423952 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.426377 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.426355 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
May 11 20:56:18.426513 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.426480 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
May 11 20:56:18.426583 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.426513 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
May 11 20:56:18.426583 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.426575 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9zd6v\""
May 11 20:56:18.426694 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.426603 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
May 11 20:56:18.444389 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.444363 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"]
May 11 20:56:18.597664 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.597633 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djtg8\" (UniqueName: \"kubernetes.io/projected/4fe0c3de-20cd-4ad3-83a8-876b1eebe765-kube-api-access-djtg8\") pod \"opendatahub-operator-controller-manager-755c95f69f-vwxsj\" (UID: \"4fe0c3de-20cd-4ad3-83a8-876b1eebe765\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.597825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.597699 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4fe0c3de-20cd-4ad3-83a8-876b1eebe765-apiservice-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-vwxsj\" (UID: \"4fe0c3de-20cd-4ad3-83a8-876b1eebe765\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.597825 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.597726 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4fe0c3de-20cd-4ad3-83a8-876b1eebe765-webhook-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-vwxsj\" (UID: \"4fe0c3de-20cd-4ad3-83a8-876b1eebe765\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.698355 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.698274 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djtg8\" (UniqueName: \"kubernetes.io/projected/4fe0c3de-20cd-4ad3-83a8-876b1eebe765-kube-api-access-djtg8\") pod \"opendatahub-operator-controller-manager-755c95f69f-vwxsj\" (UID: \"4fe0c3de-20cd-4ad3-83a8-876b1eebe765\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.698500 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.698354 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4fe0c3de-20cd-4ad3-83a8-876b1eebe765-apiservice-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-vwxsj\" (UID: \"4fe0c3de-20cd-4ad3-83a8-876b1eebe765\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.698500 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.698393 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4fe0c3de-20cd-4ad3-83a8-876b1eebe765-webhook-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-vwxsj\" (UID: \"4fe0c3de-20cd-4ad3-83a8-876b1eebe765\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.700757 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.700732 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4fe0c3de-20cd-4ad3-83a8-876b1eebe765-apiservice-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-vwxsj\" (UID: \"4fe0c3de-20cd-4ad3-83a8-876b1eebe765\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.700862 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.700759 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4fe0c3de-20cd-4ad3-83a8-876b1eebe765-webhook-cert\") pod \"opendatahub-operator-controller-manager-755c95f69f-vwxsj\" (UID: \"4fe0c3de-20cd-4ad3-83a8-876b1eebe765\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.706973 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.706950 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djtg8\" (UniqueName: \"kubernetes.io/projected/4fe0c3de-20cd-4ad3-83a8-876b1eebe765-kube-api-access-djtg8\") pod \"opendatahub-operator-controller-manager-755c95f69f-vwxsj\" (UID: \"4fe0c3de-20cd-4ad3-83a8-876b1eebe765\") " pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.734759 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.734733 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:18.859320 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.859294 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"]
May 11 20:56:18.861992 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:56:18.861964 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fe0c3de_20cd_4ad3_83a8_876b1eebe765.slice/crio-8d80d4ed3c6151d0f5747c17442fa2eb9e2124d2f441b089b34de7cdf4cf3bc5 WatchSource:0}: Error finding container 8d80d4ed3c6151d0f5747c17442fa2eb9e2124d2f441b089b34de7cdf4cf3bc5: Status 404 returned error can't find the container with id 8d80d4ed3c6151d0f5747c17442fa2eb9e2124d2f441b089b34de7cdf4cf3bc5
May 11 20:56:18.863507 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:18.863488 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
May 11 20:56:19.490861 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:19.490827 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj" event={"ID":"4fe0c3de-20cd-4ad3-83a8-876b1eebe765","Type":"ContainerStarted","Data":"8d80d4ed3c6151d0f5747c17442fa2eb9e2124d2f441b089b34de7cdf4cf3bc5"}
May 11 20:56:21.501954 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:21.501922 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj" event={"ID":"4fe0c3de-20cd-4ad3-83a8-876b1eebe765","Type":"ContainerStarted","Data":"07384d24f10bd26710104398b72500031fa695469aeac69821f470bec0789cad"}
May 11 20:56:21.502425 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:21.502069 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:21.527963 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:21.527912 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj" podStartSLOduration=1.06758675 podStartE2EDuration="3.527893795s" podCreationTimestamp="2026-05-11 20:56:18 +0000 UTC" firstStartedPulling="2026-05-11 20:56:18.863642195 +0000 UTC m=+362.161786886" lastFinishedPulling="2026-05-11 20:56:21.323949242 +0000 UTC m=+364.622093931" observedRunningTime="2026-05-11 20:56:21.525864062 +0000 UTC m=+364.824008773" watchObservedRunningTime="2026-05-11 20:56:21.527893795 +0000 UTC m=+364.826038507"
May 11 20:56:23.729875 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.729844 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"]
May 11 20:56:23.733083 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.733067 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.736864 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.736838 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
May 11 20:56:23.736864 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.736840 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-6k2xh\""
May 11 20:56:23.737090 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.736876 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
May 11 20:56:23.737090 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.736862 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
May 11 20:56:23.737090 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.736915 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
May 11 20:56:23.737090 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.736966 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
May 11 20:56:23.751756 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.751733 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"]
May 11 20:56:23.842251 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.842217 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d48888a-4be7-4470-b095-efad539e3b56-cert\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.842251 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.842256 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d48888a-4be7-4470-b095-efad539e3b56-metrics-cert\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.842446 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.842287 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lj8g\" (UniqueName: \"kubernetes.io/projected/6d48888a-4be7-4470-b095-efad539e3b56-kube-api-access-4lj8g\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.842446 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.842356 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6d48888a-4be7-4470-b095-efad539e3b56-manager-config\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.943620 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.943587 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d48888a-4be7-4470-b095-efad539e3b56-metrics-cert\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.943754 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.943633 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lj8g\" (UniqueName: \"kubernetes.io/projected/6d48888a-4be7-4470-b095-efad539e3b56-kube-api-access-4lj8g\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.943754 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.943663 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6d48888a-4be7-4470-b095-efad539e3b56-manager-config\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.943754 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.943728 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d48888a-4be7-4470-b095-efad539e3b56-cert\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.944457 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.944430 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6d48888a-4be7-4470-b095-efad539e3b56-manager-config\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.946381 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.946357 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d48888a-4be7-4470-b095-efad539e3b56-cert\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.946751 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.946731 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d48888a-4be7-4470-b095-efad539e3b56-metrics-cert\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:23.952996 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:23.952974 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lj8g\" (UniqueName: \"kubernetes.io/projected/6d48888a-4be7-4470-b095-efad539e3b56-kube-api-access-4lj8g\") pod \"lws-controller-manager-68d9b68cf6-vdhg2\" (UID: \"6d48888a-4be7-4470-b095-efad539e3b56\") " pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:24.041870 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:24.041778 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:24.194794 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:24.194769 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"]
May 11 20:56:24.196597 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:56:24.196571 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d48888a_4be7_4470_b095_efad539e3b56.slice/crio-e901c00a807970fd5cee7725be93ebf9861c7ae9679971c1b5761d75992c4467 WatchSource:0}: Error finding container e901c00a807970fd5cee7725be93ebf9861c7ae9679971c1b5761d75992c4467: Status 404 returned error can't find the container with id e901c00a807970fd5cee7725be93ebf9861c7ae9679971c1b5761d75992c4467
May 11 20:56:24.511303 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:24.511257 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2" event={"ID":"6d48888a-4be7-4470-b095-efad539e3b56","Type":"ContainerStarted","Data":"e901c00a807970fd5cee7725be93ebf9861c7ae9679971c1b5761d75992c4467"}
May 11 20:56:27.524317 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:27.524274 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2" event={"ID":"6d48888a-4be7-4470-b095-efad539e3b56","Type":"ContainerStarted","Data":"a1afd67344fcf7f5dd0a9db3b5deb74399e1a183464ade7df802d087879a532a"}
May 11 20:56:27.524873 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:27.524392 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:27.540102 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:27.540059 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2" podStartSLOduration=1.881333347 podStartE2EDuration="4.540045508s" podCreationTimestamp="2026-05-11 20:56:23 +0000 UTC" firstStartedPulling="2026-05-11 20:56:24.198627517 +0000 UTC m=+367.496772219" lastFinishedPulling="2026-05-11 20:56:26.857339687 +0000 UTC m=+370.155484380" observedRunningTime="2026-05-11 20:56:27.538933914 +0000 UTC m=+370.837078662" watchObservedRunningTime="2026-05-11 20:56:27.540045508 +0000 UTC m=+370.838190219"
May 11 20:56:32.507112 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:32.507072 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-755c95f69f-vwxsj"
May 11 20:56:38.529676 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:38.529640 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-68d9b68cf6-vdhg2"
May 11 20:56:57.826219 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.826178 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"]
May 11 20:56:57.837205 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.837181 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:57.840061 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.840036 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-gw-ca-root-cert\""
May 11 20:56:57.840327 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.840310 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-b8g4l\""
May 11 20:56:57.841802 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.841778 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"]
May 11 20:56:57.911955 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.911920 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:57.912149 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.911959 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:57.912149 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.912098 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/77e76c93-c30b-4154-b356-fe63f4d57502-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:57.912149 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.912139 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:57.912300 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.912177 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/77e76c93-c30b-4154-b356-fe63f4d57502-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:57.912300 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.912195 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsbsh\" (UniqueName: \"kubernetes.io/projected/77e76c93-c30b-4154-b356-fe63f4d57502-kube-api-access-bsbsh\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:57.912300 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.912216 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:57.912300 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.912239 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/77e76c93-c30b-4154-b356-fe63f4d57502-istio-token\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:57.912424 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:57.912307 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-istio-data\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013036 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.012991 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-istio-data\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013297 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013049 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013297 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013069 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013297 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013104 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/77e76c93-c30b-4154-b356-fe63f4d57502-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013297 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013129 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013297 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013163 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/77e76c93-c30b-4154-b356-fe63f4d57502-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013297 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013192 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsbsh\" (UniqueName: \"kubernetes.io/projected/77e76c93-c30b-4154-b356-fe63f4d57502-kube-api-access-bsbsh\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013297 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013223 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013668 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013497 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013668 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013542 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
May 11 20:56:58.013668 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013653 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-istio-data\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:56:58.013668 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013661 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/77e76c93-c30b-4154-b356-fe63f4d57502-istio-token\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:56:58.013874 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013840 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:56:58.013939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.013917 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/77e76c93-c30b-4154-b356-fe63f4d57502-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:56:58.015628 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.015603 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/77e76c93-c30b-4154-b356-fe63f4d57502-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:56:58.015716 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.015697 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/77e76c93-c30b-4154-b356-fe63f4d57502-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:56:58.021702 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.021678 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/77e76c93-c30b-4154-b356-fe63f4d57502-istio-token\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:56:58.021798 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.021703 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsbsh\" (UniqueName: \"kubernetes.io/projected/77e76c93-c30b-4154-b356-fe63f4d57502-kube-api-access-bsbsh\") pod \"data-science-gateway-data-science-gateway-class-595b7776f85sn8z\" (UID: \"77e76c93-c30b-4154-b356-fe63f4d57502\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:56:58.148188 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.148112 2562 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:56:58.276155 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.275966 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z"] May 11 20:56:58.278586 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:56:58.278553 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77e76c93_c30b_4154_b356_fe63f4d57502.slice/crio-dc9e087753454328760a3271801966807fe4d7c0a3be640931c84bdb5b40f70e WatchSource:0}: Error finding container dc9e087753454328760a3271801966807fe4d7c0a3be640931c84bdb5b40f70e: Status 404 returned error can't find the container with id dc9e087753454328760a3271801966807fe4d7c0a3be640931c84bdb5b40f70e May 11 20:56:58.619207 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:56:58.619164 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" event={"ID":"77e76c93-c30b-4154-b356-fe63f4d57502","Type":"ContainerStarted","Data":"dc9e087753454328760a3271801966807fe4d7c0a3be640931c84bdb5b40f70e"} May 11 20:57:00.972315 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:00.972263 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} May 11 20:57:00.972627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:00.972336 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} May 11 20:57:00.972627 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:00.972362 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} May 11 20:57:01.630738 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:01.630705 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" event={"ID":"77e76c93-c30b-4154-b356-fe63f4d57502","Type":"ContainerStarted","Data":"5347fe4277a817b7ab17238eac70f1bff89d6cea3e319c8b588c3b4b908357ef"} May 11 20:57:01.653272 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:01.653223 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" podStartSLOduration=1.9618366040000002 podStartE2EDuration="4.653207932s" podCreationTimestamp="2026-05-11 20:56:57 +0000 UTC" firstStartedPulling="2026-05-11 20:56:58.280651757 +0000 UTC m=+401.578796449" lastFinishedPulling="2026-05-11 20:57:00.972023077 +0000 UTC m=+404.270167777" observedRunningTime="2026-05-11 20:57:01.650777773 +0000 UTC m=+404.948922494" watchObservedRunningTime="2026-05-11 20:57:01.653207932 +0000 UTC m=+404.951352642" May 11 20:57:02.148463 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:02.148427 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:57:02.153162 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:02.153137 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:57:02.633537 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:02.633500 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:57:02.634538 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:02.634518 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-595b7776f85sn8z" May 11 20:57:35.972045 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:35.971993 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z4wzm"] May 11 20:57:35.982301 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:35.982269 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z4wzm"] May 11 20:57:35.982458 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:35.982421 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" May 11 20:57:35.984997 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:35.984972 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" May 11 20:57:35.985988 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:35.985967 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" May 11 20:57:35.986093 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:35.985975 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-b8n89\"" May 11 20:57:36.023198 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:36.023164 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9c2x\" (UniqueName: \"kubernetes.io/projected/9ee2b687-0bed-4f8e-be7b-5f403af9a754-kube-api-access-c9c2x\") pod \"kuadrant-operator-catalog-z4wzm\" (UID: \"9ee2b687-0bed-4f8e-be7b-5f403af9a754\") " pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" May 11 20:57:36.123604 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:36.123562 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9c2x\" (UniqueName: \"kubernetes.io/projected/9ee2b687-0bed-4f8e-be7b-5f403af9a754-kube-api-access-c9c2x\") pod \"kuadrant-operator-catalog-z4wzm\" (UID: \"9ee2b687-0bed-4f8e-be7b-5f403af9a754\") " pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" May 11 20:57:36.131863 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:36.131836 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9c2x\" (UniqueName: \"kubernetes.io/projected/9ee2b687-0bed-4f8e-be7b-5f403af9a754-kube-api-access-c9c2x\") pod \"kuadrant-operator-catalog-z4wzm\" (UID: \"9ee2b687-0bed-4f8e-be7b-5f403af9a754\") " pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" May 11 20:57:36.294425 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:36.294334 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" May 11 20:57:36.337706 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:36.337672 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z4wzm"] May 11 20:57:36.423340 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:36.423307 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z4wzm"] May 11 20:57:36.426963 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:57:36.426936 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee2b687_0bed_4f8e_be7b_5f403af9a754.slice/crio-998a2ec4ef2232e8db7995e6ca65233c04b7885a0863797b58a6b9cc2553ce6a WatchSource:0}: Error finding container 998a2ec4ef2232e8db7995e6ca65233c04b7885a0863797b58a6b9cc2553ce6a: Status 404 returned error can't find the container with id 998a2ec4ef2232e8db7995e6ca65233c04b7885a0863797b58a6b9cc2553ce6a May 11 20:57:36.736679 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:36.736644 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" event={"ID":"9ee2b687-0bed-4f8e-be7b-5f403af9a754","Type":"ContainerStarted","Data":"998a2ec4ef2232e8db7995e6ca65233c04b7885a0863797b58a6b9cc2553ce6a"} May 11 20:57:38.744688 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:38.744605 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" event={"ID":"9ee2b687-0bed-4f8e-be7b-5f403af9a754","Type":"ContainerStarted","Data":"d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97"} May 11 20:57:38.745137 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:38.744718 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" podUID="9ee2b687-0bed-4f8e-be7b-5f403af9a754" containerName="registry-server" containerID="cri-o://d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97" gracePeriod=2 May 11 20:57:38.761536 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:38.761487 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" podStartSLOduration=1.714477745 podStartE2EDuration="3.761471589s" podCreationTimestamp="2026-05-11 20:57:35 +0000 UTC" firstStartedPulling="2026-05-11 20:57:36.428674077 +0000 UTC m=+439.726818767" lastFinishedPulling="2026-05-11 20:57:38.475667919 +0000 UTC m=+441.773812611" observedRunningTime="2026-05-11 20:57:38.75969989 +0000 UTC m=+442.057844602" watchObservedRunningTime="2026-05-11 20:57:38.761471589 +0000 UTC m=+442.059616300" May 11 20:57:38.980306 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:38.980275 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" May 11 20:57:39.045455 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.045373 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9c2x\" (UniqueName: \"kubernetes.io/projected/9ee2b687-0bed-4f8e-be7b-5f403af9a754-kube-api-access-c9c2x\") pod \"9ee2b687-0bed-4f8e-be7b-5f403af9a754\" (UID: \"9ee2b687-0bed-4f8e-be7b-5f403af9a754\") " May 11 20:57:39.047622 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.047588 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee2b687-0bed-4f8e-be7b-5f403af9a754-kube-api-access-c9c2x" (OuterVolumeSpecName: "kube-api-access-c9c2x") pod "9ee2b687-0bed-4f8e-be7b-5f403af9a754" (UID: "9ee2b687-0bed-4f8e-be7b-5f403af9a754"). InnerVolumeSpecName "kube-api-access-c9c2x". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:57:39.146304 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.146256 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9c2x\" (UniqueName: \"kubernetes.io/projected/9ee2b687-0bed-4f8e-be7b-5f403af9a754-kube-api-access-c9c2x\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\"" May 11 20:57:39.748924 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.748889 2562 generic.go:358] "Generic (PLEG): container finished" podID="9ee2b687-0bed-4f8e-be7b-5f403af9a754" containerID="d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97" exitCode=0 May 11 20:57:39.749384 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.748938 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" event={"ID":"9ee2b687-0bed-4f8e-be7b-5f403af9a754","Type":"ContainerDied","Data":"d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97"} May 11 20:57:39.749384 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.748951 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" May 11 20:57:39.749384 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.748970 2562 scope.go:117] "RemoveContainer" containerID="d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97" May 11 20:57:39.749384 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.748960 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-z4wzm" event={"ID":"9ee2b687-0bed-4f8e-be7b-5f403af9a754","Type":"ContainerDied","Data":"998a2ec4ef2232e8db7995e6ca65233c04b7885a0863797b58a6b9cc2553ce6a"} May 11 20:57:39.757244 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.757225 2562 scope.go:117] "RemoveContainer" containerID="d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97" May 11 20:57:39.757475 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:57:39.757453 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97\": container with ID starting with d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97 not found: ID does not exist" containerID="d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97" May 11 20:57:39.757519 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.757483 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97"} err="failed to get container status \"d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97\": rpc error: code = NotFound desc = could not find container \"d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97\": container with ID starting with d2b65909b21205241fd7b17ad4f680e1275844431a7aa9c9e6270c3cb8942e97 not found: ID does not exist" May 11 20:57:39.763903 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.763873 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z4wzm"] May 11 20:57:39.767765 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:39.767744 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-z4wzm"] May 11 20:57:41.313883 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:41.313845 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee2b687-0bed-4f8e-be7b-5f403af9a754" path="/var/lib/kubelet/pods/9ee2b687-0bed-4f8e-be7b-5f403af9a754/volumes" May 11 20:57:58.866759 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.866725 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27"] May 11 20:57:58.867157 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.867071 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ee2b687-0bed-4f8e-be7b-5f403af9a754" containerName="registry-server" May 11 20:57:58.867157 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.867085 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee2b687-0bed-4f8e-be7b-5f403af9a754" containerName="registry-server" May 11 20:57:58.867157 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.867147 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ee2b687-0bed-4f8e-be7b-5f403af9a754" containerName="registry-server" May 11 20:57:58.871169 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.871151 2562 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" May 11 20:57:58.874060 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.874038 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-jgkpj\"" May 11 20:57:58.874182 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.874164 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" May 11 20:57:58.874736 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.874720 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" May 11 20:57:58.874793 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.874776 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" May 11 20:57:58.884191 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:58.884169 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27"] May 11 20:57:59.012552 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:59.012511 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96nfc\" (UniqueName: \"kubernetes.io/projected/97722e29-4ce2-433e-8f05-24777f7757e9-kube-api-access-96nfc\") pod \"dns-operator-controller-manager-648d5c98bc-xvb27\" (UID: \"97722e29-4ce2-433e-8f05-24777f7757e9\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" May 11 20:57:59.113919 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:59.113883 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96nfc\" (UniqueName: \"kubernetes.io/projected/97722e29-4ce2-433e-8f05-24777f7757e9-kube-api-access-96nfc\") pod \"dns-operator-controller-manager-648d5c98bc-xvb27\" (UID: \"97722e29-4ce2-433e-8f05-24777f7757e9\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" May 11 20:57:59.129057 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:59.128993 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96nfc\" (UniqueName: \"kubernetes.io/projected/97722e29-4ce2-433e-8f05-24777f7757e9-kube-api-access-96nfc\") pod \"dns-operator-controller-manager-648d5c98bc-xvb27\" (UID: \"97722e29-4ce2-433e-8f05-24777f7757e9\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" May 11 20:57:59.180783 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:59.180757 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" May 11 20:57:59.315366 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:59.315331 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27"] May 11 20:57:59.319252 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:57:59.319227 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97722e29_4ce2_433e_8f05_24777f7757e9.slice/crio-ca8e037e295763bc023962c4a72a74d96a0494ae65b3fe485176d3ad064f850f WatchSource:0}: Error finding container ca8e037e295763bc023962c4a72a74d96a0494ae65b3fe485176d3ad064f850f: Status 404 returned error can't find the container with id ca8e037e295763bc023962c4a72a74d96a0494ae65b3fe485176d3ad064f850f May 11 20:57:59.813251 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:57:59.813218 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" event={"ID":"97722e29-4ce2-433e-8f05-24777f7757e9","Type":"ContainerStarted","Data":"ca8e037e295763bc023962c4a72a74d96a0494ae65b3fe485176d3ad064f850f"} May 11 20:58:01.820382 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:01.820343 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" event={"ID":"97722e29-4ce2-433e-8f05-24777f7757e9","Type":"ContainerStarted","Data":"fe900043409dec788f2da7f8618afca8bf64382aeb8969bef6db8e4d60c4882c"} May 11 20:58:01.820812 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:01.820465 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" May 11 20:58:01.837978 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:01.837877 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" podStartSLOduration=1.5846623050000002 podStartE2EDuration="3.837864545s" podCreationTimestamp="2026-05-11 20:57:58 +0000 UTC" firstStartedPulling="2026-05-11 20:57:59.321197575 +0000 UTC m=+462.619342264" lastFinishedPulling="2026-05-11 20:58:01.574399814 +0000 UTC m=+464.872544504" observedRunningTime="2026-05-11 20:58:01.83726351 +0000 UTC m=+465.135408257" watchObservedRunningTime="2026-05-11 20:58:01.837864545 +0000 UTC m=+465.136009255" May 11 20:58:06.593985 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:06.593949 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7"] May 11 20:58:06.597182 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:06.597164 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" May 11 20:58:06.600264 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:06.600244 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-gbp5k\"" May 11 20:58:06.624227 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:06.624202 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7"] May 11 20:58:06.681172 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:06.681138 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t78g2\" (UniqueName: \"kubernetes.io/projected/4ec695a5-cdc5-4b65-93dd-42acd4e1939b-kube-api-access-t78g2\") pod \"limitador-operator-controller-manager-85c4996f8c-8h8t7\" (UID: \"4ec695a5-cdc5-4b65-93dd-42acd4e1939b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" May 11 20:58:06.782684 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:06.782652 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t78g2\" (UniqueName: \"kubernetes.io/projected/4ec695a5-cdc5-4b65-93dd-42acd4e1939b-kube-api-access-t78g2\") pod \"limitador-operator-controller-manager-85c4996f8c-8h8t7\" (UID: \"4ec695a5-cdc5-4b65-93dd-42acd4e1939b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" May 11 20:58:06.797281 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:06.797256 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t78g2\" (UniqueName: \"kubernetes.io/projected/4ec695a5-cdc5-4b65-93dd-42acd4e1939b-kube-api-access-t78g2\") pod \"limitador-operator-controller-manager-85c4996f8c-8h8t7\" (UID: \"4ec695a5-cdc5-4b65-93dd-42acd4e1939b\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" May 11 20:58:06.906881 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:06.906798 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" May 11 20:58:07.042957 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:07.042933 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7"] May 11 20:58:07.045874 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:58:07.045850 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec695a5_cdc5_4b65_93dd_42acd4e1939b.slice/crio-3b4c22e0a47db227bfb4c1edf8a0c2e9825b709ef4493aeec247b8496e1c38ba WatchSource:0}: Error finding container 3b4c22e0a47db227bfb4c1edf8a0c2e9825b709ef4493aeec247b8496e1c38ba: Status 404 returned error can't find the container with id 3b4c22e0a47db227bfb4c1edf8a0c2e9825b709ef4493aeec247b8496e1c38ba May 11 20:58:07.843951 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:07.843910 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" event={"ID":"4ec695a5-cdc5-4b65-93dd-42acd4e1939b","Type":"ContainerStarted","Data":"3b4c22e0a47db227bfb4c1edf8a0c2e9825b709ef4493aeec247b8496e1c38ba"} May 11 20:58:08.848025 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:08.847982 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" event={"ID":"4ec695a5-cdc5-4b65-93dd-42acd4e1939b","Type":"ContainerStarted","Data":"f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52"} May 11 20:58:08.848394 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:08.848137 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" May 11 20:58:08.863935 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:08.863892 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" podStartSLOduration=1.500404905 podStartE2EDuration="2.863880559s" podCreationTimestamp="2026-05-11 20:58:06 +0000 UTC" firstStartedPulling="2026-05-11 20:58:07.047769412 +0000 UTC m=+470.345914101" lastFinishedPulling="2026-05-11 20:58:08.411245063 +0000 UTC m=+471.709389755" observedRunningTime="2026-05-11 20:58:08.863134601 +0000 UTC m=+472.161279314" watchObservedRunningTime="2026-05-11 20:58:08.863880559 +0000 UTC m=+472.162025269" May 11 20:58:12.826320 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:12.826284 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-xvb27" May 11 20:58:19.370385 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.370349 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc"] May 11 20:58:19.376558 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.376540 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.378879 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.378856 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" May 11 20:58:19.379043 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.379028 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8ksxs\"" May 11 20:58:19.379088 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.379069 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" May 11 20:58:19.385760 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.385739 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc"] May 11 20:58:19.489985 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.489949 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b54a001-6e04-4358-ab53-53a29a425298-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6fqwc\" (UID: \"2b54a001-6e04-4358-ab53-53a29a425298\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.490167 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.489999 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2b54a001-6e04-4358-ab53-53a29a425298-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-6fqwc\" (UID: \"2b54a001-6e04-4358-ab53-53a29a425298\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.490167 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.490085 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24qkt\" (UniqueName: \"kubernetes.io/projected/2b54a001-6e04-4358-ab53-53a29a425298-kube-api-access-24qkt\") pod \"kuadrant-console-plugin-6cb54b5c86-6fqwc\" (UID: \"2b54a001-6e04-4358-ab53-53a29a425298\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.590998 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.590960 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24qkt\" (UniqueName: \"kubernetes.io/projected/2b54a001-6e04-4358-ab53-53a29a425298-kube-api-access-24qkt\") pod \"kuadrant-console-plugin-6cb54b5c86-6fqwc\" (UID: \"2b54a001-6e04-4358-ab53-53a29a425298\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.591211 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.591068 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b54a001-6e04-4358-ab53-53a29a425298-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6fqwc\" (UID: \"2b54a001-6e04-4358-ab53-53a29a425298\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.591211 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.591112 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2b54a001-6e04-4358-ab53-53a29a425298-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-6fqwc\" (UID: 
\"2b54a001-6e04-4358-ab53-53a29a425298\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.591736 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.591710 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2b54a001-6e04-4358-ab53-53a29a425298-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-6fqwc\" (UID: \"2b54a001-6e04-4358-ab53-53a29a425298\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.593528 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.593500 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b54a001-6e04-4358-ab53-53a29a425298-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-6fqwc\" (UID: \"2b54a001-6e04-4358-ab53-53a29a425298\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.602541 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.602516 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24qkt\" (UniqueName: \"kubernetes.io/projected/2b54a001-6e04-4358-ab53-53a29a425298-kube-api-access-24qkt\") pod \"kuadrant-console-plugin-6cb54b5c86-6fqwc\" (UID: \"2b54a001-6e04-4358-ab53-53a29a425298\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.686027 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.685934 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" May 11 20:58:19.813738 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.813704 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc"] May 11 20:58:19.817022 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:58:19.816976 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b54a001_6e04_4358_ab53_53a29a425298.slice/crio-e774f6ef80684926560f569ed6c8a14ecb9bcc0d5b877c473dd8ad7b36b59375 WatchSource:0}: Error finding container e774f6ef80684926560f569ed6c8a14ecb9bcc0d5b877c473dd8ad7b36b59375: Status 404 returned error can't find the container with id e774f6ef80684926560f569ed6c8a14ecb9bcc0d5b877c473dd8ad7b36b59375 May 11 20:58:19.856452 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.856418 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" May 11 20:58:19.887851 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:19.887812 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" event={"ID":"2b54a001-6e04-4358-ab53-53a29a425298","Type":"ContainerStarted","Data":"e774f6ef80684926560f569ed6c8a14ecb9bcc0d5b877c473dd8ad7b36b59375"} May 11 20:58:29.844388 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.844286 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7"] May 11 20:58:29.844846 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.844597 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" containerName="manager" 
containerID="cri-o://f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52" gracePeriod=2 May 11 20:58:29.852122 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.852065 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7"] May 11 20:58:29.854797 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.854731 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" containerName="manager" probeResult="failure" output="Get \"http://10.132.0.26:8081/readyz\": dial tcp 10.132.0.26:8081: connect: connection refused" May 11 20:58:29.871671 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.871640 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd"] May 11 20:58:29.872065 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.872040 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" containerName="manager" May 11 20:58:29.872065 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.872062 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" containerName="manager" May 11 20:58:29.872233 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.872139 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" containerName="manager" May 11 20:58:29.875266 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.875243 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" May 11 20:58:29.878062 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.877935 2562 status_manager.go:895] "Failed to get status for pod" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" err="pods \"limitador-operator-controller-manager-85c4996f8c-8h8t7\" is forbidden: User \"system:node:ip-10-0-135-190.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-190.ec2.internal' and this object" May 11 20:58:29.878349 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.878326 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvcdv\" (UniqueName: \"kubernetes.io/projected/c3f98026-7cff-4e8b-8b1d-26cacd2d603d-kube-api-access-kvcdv\") pod \"limitador-operator-controller-manager-85c4996f8c-xdtdd\" (UID: \"c3f98026-7cff-4e8b-8b1d-26cacd2d603d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" May 11 20:58:29.888078 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.888056 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd"] May 11 20:58:29.979679 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.979639 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvcdv\" (UniqueName: \"kubernetes.io/projected/c3f98026-7cff-4e8b-8b1d-26cacd2d603d-kube-api-access-kvcdv\") pod \"limitador-operator-controller-manager-85c4996f8c-xdtdd\" (UID: \"c3f98026-7cff-4e8b-8b1d-26cacd2d603d\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" May 11 20:58:29.991829 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:29.991793 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvcdv\" (UniqueName: \"kubernetes.io/projected/c3f98026-7cff-4e8b-8b1d-26cacd2d603d-kube-api-access-kvcdv\") pod \"limitador-operator-controller-manager-85c4996f8c-xdtdd\" (UID: \"c3f98026-7cff-4e8b-8b1d-26cacd2d603d\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" May 11 20:58:30.215236 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:30.215193 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" May 11 20:58:42.785498 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.785474 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" May 11 20:58:42.787981 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.787952 2562 status_manager.go:895] "Failed to get status for pod" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" err="pods \"limitador-operator-controller-manager-85c4996f8c-8h8t7\" is forbidden: User \"system:node:ip-10-0-135-190.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-190.ec2.internal' and this object" May 11 20:58:42.794807 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.794639 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd"] May 11 20:58:42.797358 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:58:42.797333 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3f98026_7cff_4e8b_8b1d_26cacd2d603d.slice/crio-db5bb670b4b59c8fcf918ed2dfe0a4f08d4ffbbb538154af85bbc0c513436af0 WatchSource:0}: Error finding container db5bb670b4b59c8fcf918ed2dfe0a4f08d4ffbbb538154af85bbc0c513436af0: Status 404 returned error can't find the container with id db5bb670b4b59c8fcf918ed2dfe0a4f08d4ffbbb538154af85bbc0c513436af0 May 11 20:58:42.888538 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.888512 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t78g2\" (UniqueName: \"kubernetes.io/projected/4ec695a5-cdc5-4b65-93dd-42acd4e1939b-kube-api-access-t78g2\") pod \"4ec695a5-cdc5-4b65-93dd-42acd4e1939b\" (UID: \"4ec695a5-cdc5-4b65-93dd-42acd4e1939b\") " May 11 20:58:42.890374 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.890351 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec695a5-cdc5-4b65-93dd-42acd4e1939b-kube-api-access-t78g2" (OuterVolumeSpecName: "kube-api-access-t78g2") pod "4ec695a5-cdc5-4b65-93dd-42acd4e1939b" (UID: "4ec695a5-cdc5-4b65-93dd-42acd4e1939b"). InnerVolumeSpecName "kube-api-access-t78g2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:58:42.972973 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.972939 2562 generic.go:358] "Generic (PLEG): container finished" podID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" containerID="f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52" exitCode=0 May 11 20:58:42.973159 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.972988 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" May 11 20:58:42.973159 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.973036 2562 scope.go:117] "RemoveContainer" containerID="f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52" May 11 20:58:42.974468 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.974441 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" event={"ID":"2b54a001-6e04-4358-ab53-53a29a425298","Type":"ContainerStarted","Data":"97f29bdd8fc6500cf327c8813079bd5ebe730c8be21578ff79060887868c1938"} May 11 20:58:42.975529 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.975502 2562 status_manager.go:895] "Failed to get status for pod" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" err="pods \"limitador-operator-controller-manager-85c4996f8c-8h8t7\" is forbidden: User \"system:node:ip-10-0-135-190.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-190.ec2.internal' and this object" May 11 20:58:42.975934 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.975912 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" event={"ID":"c3f98026-7cff-4e8b-8b1d-26cacd2d603d","Type":"ContainerStarted","Data":"97298fcdb93aac21464fc2224855bf8a2bc655f386e7278e506b2efd8e633cc8"} May 11 20:58:42.976038 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.975941 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" event={"ID":"c3f98026-7cff-4e8b-8b1d-26cacd2d603d","Type":"ContainerStarted","Data":"db5bb670b4b59c8fcf918ed2dfe0a4f08d4ffbbb538154af85bbc0c513436af0"} May 11 20:58:42.976082 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.976054 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" May 11 20:58:42.977505 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.977482 2562 status_manager.go:895] "Failed to get status for pod" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" err="pods \"limitador-operator-controller-manager-85c4996f8c-8h8t7\" is forbidden: User \"system:node:ip-10-0-135-190.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-190.ec2.internal' and this object" May 11 20:58:42.981258 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.981236 2562 scope.go:117] "RemoveContainer" containerID="f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52" May 11 20:58:42.981538 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:58:42.981516 2562 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52\": container with ID starting with f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52 not found: ID does not exist" containerID="f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52" May 11 20:58:42.981632 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.981545 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52"} err="failed to get container status \"f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52\": rpc error: code = NotFound desc = could not find container \"f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52\": container with ID starting with f19bbdbadfbcdffc23ed7d4e37b1fdd03fd69e6abd9fc487b054ce1ec24bae52 not found: ID does not exist" May 11 20:58:42.989895 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.989876 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t78g2\" (UniqueName: \"kubernetes.io/projected/4ec695a5-cdc5-4b65-93dd-42acd4e1939b-kube-api-access-t78g2\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\"" May 11 20:58:42.994107 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:42.994069 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-6fqwc" podStartSLOduration=1.081257249 podStartE2EDuration="23.994060039s" podCreationTimestamp="2026-05-11 20:58:19 +0000 UTC" firstStartedPulling="2026-05-11 20:58:19.818369041 +0000 UTC m=+483.116513732" lastFinishedPulling="2026-05-11 20:58:42.731171832 +0000 UTC m=+506.029316522" observedRunningTime="2026-05-11 20:58:42.992751851 +0000 UTC m=+506.290896565" watchObservedRunningTime="2026-05-11 20:58:42.994060039 +0000 UTC m=+506.292204750" May 11 20:58:43.012366 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:43.012341 2562 status_manager.go:895] "Failed to get status for pod" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-8h8t7" err="pods \"limitador-operator-controller-manager-85c4996f8c-8h8t7\" is forbidden: User \"system:node:ip-10-0-135-190.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-135-190.ec2.internal' and this object" May 11 20:58:43.012673 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:43.012631 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" podStartSLOduration=14.012620975 podStartE2EDuration="14.012620975s" podCreationTimestamp="2026-05-11 20:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:58:43.010399543 +0000 UTC m=+506.308544255" watchObservedRunningTime="2026-05-11 20:58:43.012620975 +0000 UTC m=+506.310765687" May 11 20:58:43.314638 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:43.314557 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec695a5-cdc5-4b65-93dd-42acd4e1939b" path="/var/lib/kubelet/pods/4ec695a5-cdc5-4b65-93dd-42acd4e1939b/volumes" May 11 20:58:53.982432 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:53.982401 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xdtdd" May 11 20:58:58.455854 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.455823 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c"] May 11 20:58:58.466274 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.466249 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c"] May 11 20:58:58.466416 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.466386 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.468818 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.468799 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-zwt49\"" May 11 20:58:58.609947 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.609910 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-envoy\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.610139 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.609951 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjwx5\" (UniqueName: \"kubernetes.io/projected/53e4322f-0f5a-4032-8514-7db4b87dc759-kube-api-access-vjwx5\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.610139 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.609986 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-workload-socket\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.610139 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.610032 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/53e4322f-0f5a-4032-8514-7db4b87dc759-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.610139 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.610127 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-data\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.610301 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.610157 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-credential-socket\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.610301 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.610182 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-workload-certs\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.610301 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.610220 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-podinfo\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.610301 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.610241 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-token\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711267 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711175 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-podinfo\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711267 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711221 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-token\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711267 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711262 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-envoy\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711518 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711288 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjwx5\" (UniqueName: \"kubernetes.io/projected/53e4322f-0f5a-4032-8514-7db4b87dc759-kube-api-access-vjwx5\") pod 
\"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711518 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711319 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-workload-socket\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711518 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711342 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/53e4322f-0f5a-4032-8514-7db4b87dc759-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711518 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711400 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-data\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711518 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711424 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-credential-socket\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711518 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711476 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-workload-certs\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711836 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711812 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-workload-socket\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711894 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711865 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-data\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711947 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711919 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-workload-certs\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.711984 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.711941 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-credential-socket\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.712113 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.712095 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/53e4322f-0f5a-4032-8514-7db4b87dc759-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.714088 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.714060 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-envoy\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.714222 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.714205 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-podinfo\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.722595 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.722576 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/53e4322f-0f5a-4032-8514-7db4b87dc759-istio-token\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.723002 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.722986 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjwx5\" (UniqueName: \"kubernetes.io/projected/53e4322f-0f5a-4032-8514-7db4b87dc759-kube-api-access-vjwx5\") pod \"maas-default-gateway-openshift-default-7df95f575-pwd8c\" (UID: \"53e4322f-0f5a-4032-8514-7db4b87dc759\") " pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.778735 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.778711 2562 util.go:30] "No sandbox for pod can be found. 
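
The sequence above is the kubelet volume reconciler working through its protocol for the gateway pod: one VerifyControllerAttachedVolume line per volume (reconciler_common.go:251), then MountVolume started (reconciler_common.go:224), then MountVolume.SetUp succeeded (operation_generator.go:615). A minimal sketch that cross-checks the started and succeeded sets, assuming the journal text is saved to a file named kubelet.log (a hypothetical name), and noting that quotes inside klog messages appear backslash-escaped exactly as in the raw text:

    import re

    # Hypothetical filename for a saved copy of this journal.
    text = open("kubelet.log", encoding="utf-8").read()

    # Volume names inside klog's quoted messages appear as \"name\" in the raw text.
    started = set(re.findall(
        r'MountVolume started for volume \\"([^"\\]+)\\" \(UniqueName: \\"([^"\\]+)\\"',
        text))
    succeeded = set(re.findall(
        r'MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\" \(UniqueName: \\"([^"\\]+)\\"',
        text))

    for name, unique in sorted(started - succeeded):
        print(f"mount started but no SetUp success logged: {name} ({unique})")

On this stretch of the log every started mount has a matching success line, so the sketch prints nothing.
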
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:58.906334 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.906303 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c"] May 11 20:58:58.909312 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:58:58.909283 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e4322f_0f5a_4032_8514_7db4b87dc759.slice/crio-92d9d7bf3555bbba9f2d61ea89c298f476824511bd0e4db94bf4f87f493748cd WatchSource:0}: Error finding container 92d9d7bf3555bbba9f2d61ea89c298f476824511bd0e4db94bf4f87f493748cd: Status 404 returned error can't find the container with id 92d9d7bf3555bbba9f2d61ea89c298f476824511bd0e4db94bf4f87f493748cd May 11 20:58:58.911525 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.911494 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} May 11 20:58:58.911592 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.911570 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} May 11 20:58:58.911630 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:58.911600 2562 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} May 11 20:58:59.031571 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:59.031532 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" event={"ID":"53e4322f-0f5a-4032-8514-7db4b87dc759","Type":"ContainerStarted","Data":"571c01c520adbadc58ff5d3afa94443f31d77203616184d7f3998c5e37386c74"} May 11 20:58:59.031571 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:59.031575 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" event={"ID":"53e4322f-0f5a-4032-8514-7db4b87dc759","Type":"ContainerStarted","Data":"92d9d7bf3555bbba9f2d61ea89c298f476824511bd0e4db94bf4f87f493748cd"} May 11 20:58:59.050647 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:59.050588 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" podStartSLOduration=1.050574467 podStartE2EDuration="1.050574467s" podCreationTimestamp="2026-05-11 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 20:58:59.049694903 +0000 UTC m=+522.347839615" watchObservedRunningTime="2026-05-11 20:58:59.050574467 +0000 UTC m=+522.348719179" May 11 20:58:59.779186 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:59.779151 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:58:59.784196 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:58:59.784173 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:59:00.034430 ip-10-0-135-190 
kubenswrapper[2562]: I0511 20:59:00.034337 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:59:00.035435 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:00.035418 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-7df95f575-pwd8c" May 11 20:59:03.761353 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:03.761318 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-kd46s"] May 11 20:59:03.892369 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:03.892336 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-kd46s"] May 11 20:59:03.892545 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:03.892401 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" May 11 20:59:03.895201 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:03.895179 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-k2glf\"" May 11 20:59:03.954136 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:03.954107 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld5jc\" (UniqueName: \"kubernetes.io/projected/52161aa6-6a93-4c59-b8c0-841c4a9c9175-kube-api-access-ld5jc\") pod \"authorino-f99f4b5cd-kd46s\" (UID: \"52161aa6-6a93-4c59-b8c0-841c4a9c9175\") " pod="kuadrant-system/authorino-f99f4b5cd-kd46s" May 11 20:59:04.054870 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:04.054797 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld5jc\" (UniqueName: \"kubernetes.io/projected/52161aa6-6a93-4c59-b8c0-841c4a9c9175-kube-api-access-ld5jc\") pod \"authorino-f99f4b5cd-kd46s\" (UID: \"52161aa6-6a93-4c59-b8c0-841c4a9c9175\") " pod="kuadrant-system/authorino-f99f4b5cd-kd46s" May 11 20:59:04.067759 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:04.067729 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld5jc\" (UniqueName: \"kubernetes.io/projected/52161aa6-6a93-4c59-b8c0-841c4a9c9175-kube-api-access-ld5jc\") pod \"authorino-f99f4b5cd-kd46s\" (UID: \"52161aa6-6a93-4c59-b8c0-841c4a9c9175\") " pod="kuadrant-system/authorino-f99f4b5cd-kd46s" May 11 20:59:04.201497 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:04.201461 2562 util.go:30] "No sandbox for pod can be found. 
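
Within about 1.2 seconds the gateway pod's startup probe goes unhealthy, then started, and its readiness probe goes not ready, then ready, all reported through kubelet.go:2658 SyncLoop (probe) lines. A sketch that extracts these transitions per pod, under the same hypothetical kubelet.log assumption:

    import re

    text = open("kubelet.log", encoding="utf-8").read()  # hypothetical filename

    probe_pat = re.compile(
        r'(\d{2}:\d{2}:\d{2}\.\d+) \d+ kubelet\.go:\d+\] "SyncLoop \(probe\)" '
        r'probe="(\w+)" status="([^"]*)" pod="([^"]+)"')

    for when, probe, status, pod in probe_pat.findall(text):
        print(f"{when}  {probe:<9}  {status:<11}  {pod}")
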
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" May 11 20:59:04.331131 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:04.330974 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-kd46s"] May 11 20:59:04.333655 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:59:04.333623 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52161aa6_6a93_4c59_b8c0_841c4a9c9175.slice/crio-5a47aee85bb4e1584267f6cf3400695c8ce0b8ab685bc291a1367a3777b29beb WatchSource:0}: Error finding container 5a47aee85bb4e1584267f6cf3400695c8ce0b8ab685bc291a1367a3777b29beb: Status 404 returned error can't find the container with id 5a47aee85bb4e1584267f6cf3400695c8ce0b8ab685bc291a1367a3777b29beb May 11 20:59:05.053716 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:05.053664 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" event={"ID":"52161aa6-6a93-4c59-b8c0-841c4a9c9175","Type":"ContainerStarted","Data":"5a47aee85bb4e1584267f6cf3400695c8ce0b8ab685bc291a1367a3777b29beb"} May 11 20:59:09.069993 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:09.069951 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" event={"ID":"52161aa6-6a93-4c59-b8c0-841c4a9c9175","Type":"ContainerStarted","Data":"55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092"} May 11 20:59:09.086641 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:09.086594 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" podStartSLOduration=1.8236990830000002 podStartE2EDuration="6.086579356s" podCreationTimestamp="2026-05-11 20:59:03 +0000 UTC" firstStartedPulling="2026-05-11 20:59:04.334875026 +0000 UTC m=+527.633019715" lastFinishedPulling="2026-05-11 20:59:08.597755299 +0000 UTC m=+531.895899988" observedRunningTime="2026-05-11 20:59:09.08480126 +0000 UTC m=+532.382945973" watchObservedRunningTime="2026-05-11 20:59:09.086579356 +0000 UTC m=+532.384724067" May 11 20:59:09.245845 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:09.245811 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-kd46s"] May 11 20:59:11.076206 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:11.076165 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" podUID="52161aa6-6a93-4c59-b8c0-841c4a9c9175" containerName="authorino" containerID="cri-o://55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092" gracePeriod=30 May 11 20:59:11.851877 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:11.851853 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" May 11 20:59:11.924286 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:11.924258 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld5jc\" (UniqueName: \"kubernetes.io/projected/52161aa6-6a93-4c59-b8c0-841c4a9c9175-kube-api-access-ld5jc\") pod \"52161aa6-6a93-4c59-b8c0-841c4a9c9175\" (UID: \"52161aa6-6a93-4c59-b8c0-841c4a9c9175\") " May 11 20:59:11.926262 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:11.926233 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52161aa6-6a93-4c59-b8c0-841c4a9c9175-kube-api-access-ld5jc" (OuterVolumeSpecName: "kube-api-access-ld5jc") pod "52161aa6-6a93-4c59-b8c0-841c4a9c9175" (UID: "52161aa6-6a93-4c59-b8c0-841c4a9c9175"). InnerVolumeSpecName "kube-api-access-ld5jc". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:59:12.025901 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.025822 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ld5jc\" (UniqueName: \"kubernetes.io/projected/52161aa6-6a93-4c59-b8c0-841c4a9c9175-kube-api-access-ld5jc\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\"" May 11 20:59:12.080332 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.080296 2562 generic.go:358] "Generic (PLEG): container finished" podID="52161aa6-6a93-4c59-b8c0-841c4a9c9175" containerID="55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092" exitCode=0 May 11 20:59:12.080695 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.080345 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" May 11 20:59:12.080695 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.080378 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" event={"ID":"52161aa6-6a93-4c59-b8c0-841c4a9c9175","Type":"ContainerDied","Data":"55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092"} May 11 20:59:12.080695 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.080421 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-kd46s" event={"ID":"52161aa6-6a93-4c59-b8c0-841c4a9c9175","Type":"ContainerDied","Data":"5a47aee85bb4e1584267f6cf3400695c8ce0b8ab685bc291a1367a3777b29beb"} May 11 20:59:12.080695 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.080437 2562 scope.go:117] "RemoveContainer" containerID="55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092" May 11 20:59:12.088805 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.088783 2562 scope.go:117] "RemoveContainer" containerID="55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092" May 11 20:59:12.089079 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:59:12.089057 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092\": container with ID starting with 55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092 not found: ID does not exist" containerID="55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092" May 11 20:59:12.089174 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.089084 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092"} 
err="failed to get container status \"55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092\": rpc error: code = NotFound desc = could not find container \"55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092\": container with ID starting with 55bd6f74095030980c6436e2e297beecb5b751e74354a3b7621b9bca58485092 not found: ID does not exist" May 11 20:59:12.101060 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.101034 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-kd46s"] May 11 20:59:12.102423 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:12.102402 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-kd46s"] May 11 20:59:13.314428 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:13.314391 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52161aa6-6a93-4c59-b8c0-841c4a9c9175" path="/var/lib/kubelet/pods/52161aa6-6a93-4c59-b8c0-841c4a9c9175/volumes" May 11 20:59:36.798316 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:36.798284 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-zwrtt"] May 11 20:59:36.798768 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:36.798591 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52161aa6-6a93-4c59-b8c0-841c4a9c9175" containerName="authorino" May 11 20:59:36.798768 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:36.798607 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="52161aa6-6a93-4c59-b8c0-841c4a9c9175" containerName="authorino" May 11 20:59:36.798768 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:36.798674 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="52161aa6-6a93-4c59-b8c0-841c4a9c9175" containerName="authorino" May 11 20:59:36.824235 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:36.824209 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-zwrtt"] May 11 20:59:36.824372 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:36.824306 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-zwrtt" May 11 20:59:36.826620 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:36.826600 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-k2glf\"" May 11 20:59:36.930709 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:36.930686 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-748w2\" (UniqueName: \"kubernetes.io/projected/afafcf74-9c7f-4404-ade5-1d2bc72d54dd-kube-api-access-748w2\") pod \"authorino-8b475cf9f-zwrtt\" (UID: \"afafcf74-9c7f-4404-ade5-1d2bc72d54dd\") " pod="kuadrant-system/authorino-8b475cf9f-zwrtt" May 11 20:59:37.031074 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.031046 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-748w2\" (UniqueName: \"kubernetes.io/projected/afafcf74-9c7f-4404-ade5-1d2bc72d54dd-kube-api-access-748w2\") pod \"authorino-8b475cf9f-zwrtt\" (UID: \"afafcf74-9c7f-4404-ade5-1d2bc72d54dd\") " pod="kuadrant-system/authorino-8b475cf9f-zwrtt" May 11 20:59:37.034228 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.034203 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-zwrtt"] May 11 20:59:37.034399 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:59:37.034381 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-748w2], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-zwrtt" podUID="afafcf74-9c7f-4404-ade5-1d2bc72d54dd" May 11 20:59:37.040242 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.040217 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-748w2\" (UniqueName: \"kubernetes.io/projected/afafcf74-9c7f-4404-ade5-1d2bc72d54dd-kube-api-access-748w2\") pod \"authorino-8b475cf9f-zwrtt\" (UID: \"afafcf74-9c7f-4404-ade5-1d2bc72d54dd\") " pod="kuadrant-system/authorino-8b475cf9f-zwrtt" May 11 20:59:37.064531 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.064476 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-54c97898d-dxk9h"] May 11 20:59:37.067810 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.067795 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54c97898d-dxk9h" May 11 20:59:37.074935 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.074915 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-54c97898d-dxk9h"] May 11 20:59:37.132072 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.132045 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sk8t\" (UniqueName: \"kubernetes.io/projected/6f1205fa-6794-4b53-a5b5-c41bc3cc7fee-kube-api-access-7sk8t\") pod \"authorino-54c97898d-dxk9h\" (UID: \"6f1205fa-6794-4b53-a5b5-c41bc3cc7fee\") " pod="kuadrant-system/authorino-54c97898d-dxk9h" May 11 20:59:37.159771 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.159743 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-zwrtt" May 11 20:59:37.163761 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.163744 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-zwrtt" May 11 20:59:37.232807 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.232787 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-748w2\" (UniqueName: \"kubernetes.io/projected/afafcf74-9c7f-4404-ade5-1d2bc72d54dd-kube-api-access-748w2\") pod \"afafcf74-9c7f-4404-ade5-1d2bc72d54dd\" (UID: \"afafcf74-9c7f-4404-ade5-1d2bc72d54dd\") " May 11 20:59:37.232939 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.232925 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sk8t\" (UniqueName: \"kubernetes.io/projected/6f1205fa-6794-4b53-a5b5-c41bc3cc7fee-kube-api-access-7sk8t\") pod \"authorino-54c97898d-dxk9h\" (UID: \"6f1205fa-6794-4b53-a5b5-c41bc3cc7fee\") " pod="kuadrant-system/authorino-54c97898d-dxk9h" May 11 20:59:37.234829 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.234805 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afafcf74-9c7f-4404-ade5-1d2bc72d54dd-kube-api-access-748w2" (OuterVolumeSpecName: "kube-api-access-748w2") pod "afafcf74-9c7f-4404-ade5-1d2bc72d54dd" (UID: "afafcf74-9c7f-4404-ade5-1d2bc72d54dd"). InnerVolumeSpecName "kube-api-access-748w2". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:59:37.241245 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.241226 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sk8t\" (UniqueName: \"kubernetes.io/projected/6f1205fa-6794-4b53-a5b5-c41bc3cc7fee-kube-api-access-7sk8t\") pod \"authorino-54c97898d-dxk9h\" (UID: \"6f1205fa-6794-4b53-a5b5-c41bc3cc7fee\") " pod="kuadrant-system/authorino-54c97898d-dxk9h" May 11 20:59:37.303305 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.303265 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-54c97898d-dxk9h"] May 11 20:59:37.303490 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.303477 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54c97898d-dxk9h" May 11 20:59:37.333664 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.333638 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-89ffbfd88-gv29n"] May 11 20:59:37.333953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.333933 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-748w2\" (UniqueName: \"kubernetes.io/projected/afafcf74-9c7f-4404-ade5-1d2bc72d54dd-kube-api-access-748w2\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\"" May 11 20:59:37.338586 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.338568 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 20:59:37.341565 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.341535 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" May 11 20:59:37.343287 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.343261 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-89ffbfd88-gv29n"] May 11 20:59:37.435424 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.435357 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpzqc\" (UniqueName: \"kubernetes.io/projected/9b7d6939-346e-4577-bda4-2b88d002fa0b-kube-api-access-rpzqc\") pod \"authorino-89ffbfd88-gv29n\" (UID: \"9b7d6939-346e-4577-bda4-2b88d002fa0b\") " pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 20:59:37.435609 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.435458 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b7d6939-346e-4577-bda4-2b88d002fa0b-tls-cert\") pod \"authorino-89ffbfd88-gv29n\" (UID: \"9b7d6939-346e-4577-bda4-2b88d002fa0b\") " pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 20:59:37.438358 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.438339 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-54c97898d-dxk9h"] May 11 20:59:37.440321 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:59:37.440296 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f1205fa_6794_4b53_a5b5_c41bc3cc7fee.slice/crio-16a64e65ca2157ce27b29eb35d7c8289ed7a613528fa1111948604d0e858fd6d WatchSource:0}: Error finding container 16a64e65ca2157ce27b29eb35d7c8289ed7a613528fa1111948604d0e858fd6d: Status 404 returned error can't find the container with id 16a64e65ca2157ce27b29eb35d7c8289ed7a613528fa1111948604d0e858fd6d May 11 20:59:37.536728 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.536696 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b7d6939-346e-4577-bda4-2b88d002fa0b-tls-cert\") pod \"authorino-89ffbfd88-gv29n\" (UID: \"9b7d6939-346e-4577-bda4-2b88d002fa0b\") " pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 20:59:37.536847 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.536776 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpzqc\" (UniqueName: \"kubernetes.io/projected/9b7d6939-346e-4577-bda4-2b88d002fa0b-kube-api-access-rpzqc\") pod \"authorino-89ffbfd88-gv29n\" (UID: \"9b7d6939-346e-4577-bda4-2b88d002fa0b\") " pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 20:59:37.539032 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.539000 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b7d6939-346e-4577-bda4-2b88d002fa0b-tls-cert\") pod \"authorino-89ffbfd88-gv29n\" (UID: \"9b7d6939-346e-4577-bda4-2b88d002fa0b\") " pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 20:59:37.544809 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.544790 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpzqc\" (UniqueName: \"kubernetes.io/projected/9b7d6939-346e-4577-bda4-2b88d002fa0b-kube-api-access-rpzqc\") pod 
\"authorino-89ffbfd88-gv29n\" (UID: \"9b7d6939-346e-4577-bda4-2b88d002fa0b\") " pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 20:59:37.649058 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.648976 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 20:59:37.771710 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:37.771685 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-89ffbfd88-gv29n"] May 11 20:59:37.773952 ip-10-0-135-190 kubenswrapper[2562]: W0511 20:59:37.773926 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b7d6939_346e_4577_bda4_2b88d002fa0b.slice/crio-37ad4d4866539cc971a35ef10f57287d4c7e04bdb8e48baa088b2b2ca1b79844 WatchSource:0}: Error finding container 37ad4d4866539cc971a35ef10f57287d4c7e04bdb8e48baa088b2b2ca1b79844: Status 404 returned error can't find the container with id 37ad4d4866539cc971a35ef10f57287d4c7e04bdb8e48baa088b2b2ca1b79844 May 11 20:59:38.164456 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.164424 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54c97898d-dxk9h" event={"ID":"6f1205fa-6794-4b53-a5b5-c41bc3cc7fee","Type":"ContainerStarted","Data":"d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c"} May 11 20:59:38.164803 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.164467 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54c97898d-dxk9h" event={"ID":"6f1205fa-6794-4b53-a5b5-c41bc3cc7fee","Type":"ContainerStarted","Data":"16a64e65ca2157ce27b29eb35d7c8289ed7a613528fa1111948604d0e858fd6d"} May 11 20:59:38.164803 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.164506 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-54c97898d-dxk9h" podUID="6f1205fa-6794-4b53-a5b5-c41bc3cc7fee" containerName="authorino" containerID="cri-o://d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c" gracePeriod=30 May 11 20:59:38.165899 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.165879 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-zwrtt" May 11 20:59:38.166034 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.165873 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-89ffbfd88-gv29n" event={"ID":"9b7d6939-346e-4577-bda4-2b88d002fa0b","Type":"ContainerStarted","Data":"a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd"} May 11 20:59:38.166034 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.165928 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-89ffbfd88-gv29n" event={"ID":"9b7d6939-346e-4577-bda4-2b88d002fa0b","Type":"ContainerStarted","Data":"37ad4d4866539cc971a35ef10f57287d4c7e04bdb8e48baa088b2b2ca1b79844"} May 11 20:59:38.179286 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.179247 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-54c97898d-dxk9h" podStartSLOduration=0.744186046 podStartE2EDuration="1.179234979s" podCreationTimestamp="2026-05-11 20:59:37 +0000 UTC" firstStartedPulling="2026-05-11 20:59:37.441508243 +0000 UTC m=+560.739652932" lastFinishedPulling="2026-05-11 20:59:37.876557173 +0000 UTC m=+561.174701865" observedRunningTime="2026-05-11 20:59:38.178339942 +0000 UTC m=+561.476484654" watchObservedRunningTime="2026-05-11 20:59:38.179234979 +0000 UTC m=+561.477379689" May 11 20:59:38.193562 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.193517 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-89ffbfd88-gv29n" podStartSLOduration=0.887167932 podStartE2EDuration="1.193502785s" podCreationTimestamp="2026-05-11 20:59:37 +0000 UTC" firstStartedPulling="2026-05-11 20:59:37.775319662 +0000 UTC m=+561.073464351" lastFinishedPulling="2026-05-11 20:59:38.081654515 +0000 UTC m=+561.379799204" observedRunningTime="2026-05-11 20:59:38.192377129 +0000 UTC m=+561.490521850" watchObservedRunningTime="2026-05-11 20:59:38.193502785 +0000 UTC m=+561.491647558" May 11 20:59:38.217345 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.217311 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-zwrtt"] May 11 20:59:38.221145 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.221118 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-zwrtt"] May 11 20:59:38.399230 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.399207 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54c97898d-dxk9h" May 11 20:59:38.444207 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.444130 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sk8t\" (UniqueName: \"kubernetes.io/projected/6f1205fa-6794-4b53-a5b5-c41bc3cc7fee-kube-api-access-7sk8t\") pod \"6f1205fa-6794-4b53-a5b5-c41bc3cc7fee\" (UID: \"6f1205fa-6794-4b53-a5b5-c41bc3cc7fee\") " May 11 20:59:38.446144 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.446120 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1205fa-6794-4b53-a5b5-c41bc3cc7fee-kube-api-access-7sk8t" (OuterVolumeSpecName: "kube-api-access-7sk8t") pod "6f1205fa-6794-4b53-a5b5-c41bc3cc7fee" (UID: "6f1205fa-6794-4b53-a5b5-c41bc3cc7fee"). InnerVolumeSpecName "kube-api-access-7sk8t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 20:59:38.545385 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:38.545359 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7sk8t\" (UniqueName: \"kubernetes.io/projected/6f1205fa-6794-4b53-a5b5-c41bc3cc7fee-kube-api-access-7sk8t\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\"" May 11 20:59:39.169888 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.169854 2562 generic.go:358] "Generic (PLEG): container finished" podID="6f1205fa-6794-4b53-a5b5-c41bc3cc7fee" containerID="d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c" exitCode=0 May 11 20:59:39.170363 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.169915 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54c97898d-dxk9h" May 11 20:59:39.170363 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.169944 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54c97898d-dxk9h" event={"ID":"6f1205fa-6794-4b53-a5b5-c41bc3cc7fee","Type":"ContainerDied","Data":"d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c"} May 11 20:59:39.170363 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.169979 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54c97898d-dxk9h" event={"ID":"6f1205fa-6794-4b53-a5b5-c41bc3cc7fee","Type":"ContainerDied","Data":"16a64e65ca2157ce27b29eb35d7c8289ed7a613528fa1111948604d0e858fd6d"} May 11 20:59:39.170363 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.169995 2562 scope.go:117] "RemoveContainer" containerID="d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c" May 11 20:59:39.177953 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.177934 2562 scope.go:117] "RemoveContainer" containerID="d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c" May 11 20:59:39.178226 ip-10-0-135-190 kubenswrapper[2562]: E0511 20:59:39.178208 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c\": container with ID starting with d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c not found: ID does not exist" containerID="d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c" May 11 20:59:39.178296 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.178233 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c"} err="failed to get container status \"d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c\": rpc error: code = NotFound desc = could not find container \"d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c\": container with ID starting with d589820614d2d99a754e094e2327fe13a32048d38ce51a718a7ed433e8aa244c not found: ID does not exist" May 11 20:59:39.190733 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.190704 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-54c97898d-dxk9h"] May 11 20:59:39.194197 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.194174 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-54c97898d-dxk9h"] May 11 20:59:39.314945 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.314867 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6f1205fa-6794-4b53-a5b5-c41bc3cc7fee" path="/var/lib/kubelet/pods/6f1205fa-6794-4b53-a5b5-c41bc3cc7fee/volumes" May 11 20:59:39.315260 ip-10-0-135-190 kubenswrapper[2562]: I0511 20:59:39.315244 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afafcf74-9c7f-4404-ade5-1d2bc72d54dd" path="/var/lib/kubelet/pods/afafcf74-9c7f-4404-ade5-1d2bc72d54dd/volumes" May 11 21:00:17.253531 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:17.253505 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:00:17.254694 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:17.254669 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:00:17.257926 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:17.257902 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:00:17.259224 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:17.259207 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:00:19.279921 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.279892 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh"] May 11 21:00:19.280305 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.280212 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f1205fa-6794-4b53-a5b5-c41bc3cc7fee" containerName="authorino" May 11 21:00:19.280305 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.280225 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1205fa-6794-4b53-a5b5-c41bc3cc7fee" containerName="authorino" May 11 21:00:19.280305 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.280289 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f1205fa-6794-4b53-a5b5-c41bc3cc7fee" containerName="authorino" May 11 21:00:19.284887 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.284870 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.287509 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.287487 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" May 11 21:00:19.288575 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.288547 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-cbxdf\"" May 11 21:00:19.288575 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.288567 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" May 11 21:00:19.288732 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.288654 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" May 11 21:00:19.293857 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.293837 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh"] May 11 21:00:19.475913 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.475877 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.476124 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.475962 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.476124 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.475999 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.476307 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.476067 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.476451 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.476387 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8ssd\" (UniqueName: \"kubernetes.io/projected/7f8fff36-e982-46d9-ab5b-55be58cfba20-kube-api-access-j8ssd\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 
21:00:19.476451 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.476440 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8fff36-e982-46d9-ab5b-55be58cfba20-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.577692 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.577613 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.577692 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.577651 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.577692 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.577672 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.577692 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.577691 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8ssd\" (UniqueName: \"kubernetes.io/projected/7f8fff36-e982-46d9-ab5b-55be58cfba20-kube-api-access-j8ssd\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.578057 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.577710 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8fff36-e982-46d9-ab5b-55be58cfba20-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.578057 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.577901 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.578057 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.578057 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-kserve-provision-location\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.578212 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.578125 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.578212 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.578196 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.579932 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.579914 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7f8fff36-e982-46d9-ab5b-55be58cfba20-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.580208 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.580191 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f8fff36-e982-46d9-ab5b-55be58cfba20-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.586363 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.586340 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8ssd\" (UniqueName: \"kubernetes.io/projected/7f8fff36-e982-46d9-ab5b-55be58cfba20-kube-api-access-j8ssd\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh\" (UID: \"7f8fff36-e982-46d9-ab5b-55be58cfba20\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.595337 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.595316 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:19.734553 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:19.734528 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh"] May 11 21:00:19.736929 ip-10-0-135-190 kubenswrapper[2562]: W0511 21:00:19.736895 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f8fff36_e982_46d9_ab5b_55be58cfba20.slice/crio-a29dbbcb3a0dc474c9d8910cd292a0ec1f443eda9e3d16872a5663075aab9433 WatchSource:0}: Error finding container a29dbbcb3a0dc474c9d8910cd292a0ec1f443eda9e3d16872a5663075aab9433: Status 404 returned error can't find the container with id a29dbbcb3a0dc474c9d8910cd292a0ec1f443eda9e3d16872a5663075aab9433 May 11 21:00:20.315150 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:20.315110 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" event={"ID":"7f8fff36-e982-46d9-ab5b-55be58cfba20","Type":"ContainerStarted","Data":"a29dbbcb3a0dc474c9d8910cd292a0ec1f443eda9e3d16872a5663075aab9433"} May 11 21:00:26.432154 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.432122 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278"] May 11 21:00:26.478816 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.478774 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278"] May 11 21:00:26.478993 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.478943 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.481780 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.481560 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" May 11 21:00:26.535950 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.535919 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbng\" (UniqueName: \"kubernetes.io/projected/529c5254-c8df-4838-bdf0-32290763f25e-kube-api-access-dtbng\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.536097 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.535978 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/529c5254-c8df-4838-bdf0-32290763f25e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.536097 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.536082 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.536168 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.536109 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.536168 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.536139 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.536234 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.536175 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.636898 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.636863 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/529c5254-c8df-4838-bdf0-32290763f25e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.637062 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.636927 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.637062 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.636948 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.637062 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.636975 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.637062 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.637002 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.637062 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.637061 
2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtbng\" (UniqueName: \"kubernetes.io/projected/529c5254-c8df-4838-bdf0-32290763f25e-kube-api-access-dtbng\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.637463 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.637436 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.637562 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.637502 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.637799 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.637622 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.642107 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.642084 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/529c5254-c8df-4838-bdf0-32290763f25e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.645305 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.644481 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/529c5254-c8df-4838-bdf0-32290763f25e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.646988 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.646964 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbng\" (UniqueName: \"kubernetes.io/projected/529c5254-c8df-4838-bdf0-32290763f25e-kube-api-access-dtbng\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278\" (UID: \"529c5254-c8df-4838-bdf0-32290763f25e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.794907 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.794868 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:26.934586 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:26.934557 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278"] May 11 21:00:26.935816 ip-10-0-135-190 kubenswrapper[2562]: W0511 21:00:26.935790 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod529c5254_c8df_4838_bdf0_32290763f25e.slice/crio-43d43e383b8fb032e40ac15d0dfe1a8d9da963425d28ab6d1d43104956bb1da0 WatchSource:0}: Error finding container 43d43e383b8fb032e40ac15d0dfe1a8d9da963425d28ab6d1d43104956bb1da0: Status 404 returned error can't find the container with id 43d43e383b8fb032e40ac15d0dfe1a8d9da963425d28ab6d1d43104956bb1da0 May 11 21:00:27.340769 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:27.340732 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" event={"ID":"529c5254-c8df-4838-bdf0-32290763f25e","Type":"ContainerStarted","Data":"f30b603809f85dc4491b25e0abe06968f2e1c2cc0bd9252313ece985f12d7d4b"} May 11 21:00:27.340769 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:27.340774 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" event={"ID":"529c5254-c8df-4838-bdf0-32290763f25e","Type":"ContainerStarted","Data":"43d43e383b8fb032e40ac15d0dfe1a8d9da963425d28ab6d1d43104956bb1da0"} May 11 21:00:27.342256 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:27.342228 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" event={"ID":"7f8fff36-e982-46d9-ab5b-55be58cfba20","Type":"ContainerStarted","Data":"8055686c40a8b61fba021e088c8389f6badd0032e001441983b7f279a41a03eb"} May 11 21:00:33.366611 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:33.366579 2562 generic.go:358] "Generic (PLEG): container finished" podID="529c5254-c8df-4838-bdf0-32290763f25e" containerID="f30b603809f85dc4491b25e0abe06968f2e1c2cc0bd9252313ece985f12d7d4b" exitCode=0 May 11 21:00:33.366990 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:33.366661 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" event={"ID":"529c5254-c8df-4838-bdf0-32290763f25e","Type":"ContainerDied","Data":"f30b603809f85dc4491b25e0abe06968f2e1c2cc0bd9252313ece985f12d7d4b"} May 11 21:00:35.375392 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:35.375318 2562 generic.go:358] "Generic (PLEG): container finished" podID="7f8fff36-e982-46d9-ab5b-55be58cfba20" containerID="8055686c40a8b61fba021e088c8389f6badd0032e001441983b7f279a41a03eb" exitCode=0 May 11 21:00:35.375392 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:35.375376 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" event={"ID":"7f8fff36-e982-46d9-ab5b-55be58cfba20","Type":"ContainerDied","Data":"8055686c40a8b61fba021e088c8389f6badd0032e001441983b7f279a41a03eb"} May 11 21:00:37.385763 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.385723 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" event={"ID":"7f8fff36-e982-46d9-ab5b-55be58cfba20","Type":"ContainerStarted","Data":"68781302bd1ee036e3e34c52a728010d64a7d33ae967c83cb03cb83ec6dce2de"} May 11 21:00:37.386198 ip-10-0-135-190 
kubenswrapper[2562]: I0511 21:00:37.386036 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:37.407178 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.407134 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" podStartSLOduration=1.480847777 podStartE2EDuration="18.407120785s" podCreationTimestamp="2026-05-11 21:00:19 +0000 UTC" firstStartedPulling="2026-05-11 21:00:19.738834597 +0000 UTC m=+603.036979300" lastFinishedPulling="2026-05-11 21:00:36.665107618 +0000 UTC m=+619.963252308" observedRunningTime="2026-05-11 21:00:37.405512311 +0000 UTC m=+620.703657024" watchObservedRunningTime="2026-05-11 21:00:37.407120785 +0000 UTC m=+620.705265495" May 11 21:00:37.482723 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.482694 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t"] May 11 21:00:37.486741 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.486726 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.489164 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.489145 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" May 11 21:00:37.497415 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.497395 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t"] May 11 21:00:37.532768 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.532740 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.532879 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.532790 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.532879 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.532822 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s74ns\" (UniqueName: \"kubernetes.io/projected/305fc11d-a311-4c26-8966-cf4370ceb203-kube-api-access-s74ns\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.532879 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.532871 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.532987 
ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.532946 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/305fc11d-a311-4c26-8966-cf4370ceb203-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.532987 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.532981 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.634307 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.634284 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/305fc11d-a311-4c26-8966-cf4370ceb203-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.634439 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.634317 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.634439 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.634369 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.634439 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.634399 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.634439 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.634426 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s74ns\" (UniqueName: \"kubernetes.io/projected/305fc11d-a311-4c26-8966-cf4370ceb203-kube-api-access-s74ns\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.634603 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.634475 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.634755 ip-10-0-135-190 
kubenswrapper[2562]: I0511 21:00:37.634730 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.634824 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.634758 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.634824 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.634813 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.636547 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.636502 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/305fc11d-a311-4c26-8966-cf4370ceb203-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.636891 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.636871 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/305fc11d-a311-4c26-8966-cf4370ceb203-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.641954 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.641928 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s74ns\" (UniqueName: \"kubernetes.io/projected/305fc11d-a311-4c26-8966-cf4370ceb203-kube-api-access-s74ns\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t\" (UID: \"305fc11d-a311-4c26-8966-cf4370ceb203\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.797027 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.796979 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:00:37.929959 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:37.929935 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t"] May 11 21:00:37.932362 ip-10-0-135-190 kubenswrapper[2562]: W0511 21:00:37.932332 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod305fc11d_a311_4c26_8966_cf4370ceb203.slice/crio-e4ed2454612d00b5e38389c82bb223c2caeef8c30558439df7bdda5552c93402 WatchSource:0}: Error finding container e4ed2454612d00b5e38389c82bb223c2caeef8c30558439df7bdda5552c93402: Status 404 returned error can't find the container with id e4ed2454612d00b5e38389c82bb223c2caeef8c30558439df7bdda5552c93402 May 11 21:00:38.390535 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:38.390495 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" event={"ID":"305fc11d-a311-4c26-8966-cf4370ceb203","Type":"ContainerStarted","Data":"4095cfdde6b52f0973e25deaa9d6cc69a2fa145e4d793df7c840405e0517aa1c"} May 11 21:00:38.390535 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:38.390541 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" event={"ID":"305fc11d-a311-4c26-8966-cf4370ceb203","Type":"ContainerStarted","Data":"e4ed2454612d00b5e38389c82bb223c2caeef8c30558439df7bdda5552c93402"} May 11 21:00:43.409059 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:43.409026 2562 generic.go:358] "Generic (PLEG): container finished" podID="305fc11d-a311-4c26-8966-cf4370ceb203" containerID="4095cfdde6b52f0973e25deaa9d6cc69a2fa145e4d793df7c840405e0517aa1c" exitCode=0 May 11 21:00:43.409357 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:43.409078 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" event={"ID":"305fc11d-a311-4c26-8966-cf4370ceb203","Type":"ContainerDied","Data":"4095cfdde6b52f0973e25deaa9d6cc69a2fa145e4d793df7c840405e0517aa1c"} May 11 21:00:48.404384 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:48.404351 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh" May 11 21:00:49.429626 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:49.429593 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" event={"ID":"529c5254-c8df-4838-bdf0-32290763f25e","Type":"ContainerStarted","Data":"ec85e9ab3854c438403ba516cc4616c5372275f1ad70881c5c8ff3033167d091"} May 11 21:00:49.430050 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:49.429807 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:00:49.449782 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:00:49.449716 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" podStartSLOduration=8.210195591 podStartE2EDuration="23.449702581s" podCreationTimestamp="2026-05-11 21:00:26 +0000 UTC" firstStartedPulling="2026-05-11 21:00:33.36784814 +0000 UTC m=+616.665992846" lastFinishedPulling="2026-05-11 21:00:48.607355147 +0000 UTC m=+631.905499836" observedRunningTime="2026-05-11 21:00:49.449560642 +0000 UTC m=+632.747705354" 
watchObservedRunningTime="2026-05-11 21:00:49.449702581 +0000 UTC m=+632.747847295" May 11 21:01:00.445558 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:00.445525 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278" May 11 21:01:05.935373 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:05.935333 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-85899c578f-5rw88"] May 11 21:01:05.938731 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:05.938715 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-85899c578f-5rw88" May 11 21:01:05.946952 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:05.946927 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85899c578f-5rw88"] May 11 21:01:06.078107 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:06.078069 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln4f6\" (UniqueName: \"kubernetes.io/projected/27e579bf-250b-45f9-98f2-1aae247b31b9-kube-api-access-ln4f6\") pod \"authorino-85899c578f-5rw88\" (UID: \"27e579bf-250b-45f9-98f2-1aae247b31b9\") " pod="kuadrant-system/authorino-85899c578f-5rw88" May 11 21:01:06.078273 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:06.078165 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/27e579bf-250b-45f9-98f2-1aae247b31b9-tls-cert\") pod \"authorino-85899c578f-5rw88\" (UID: \"27e579bf-250b-45f9-98f2-1aae247b31b9\") " pod="kuadrant-system/authorino-85899c578f-5rw88" May 11 21:01:06.178795 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:06.178764 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ln4f6\" (UniqueName: \"kubernetes.io/projected/27e579bf-250b-45f9-98f2-1aae247b31b9-kube-api-access-ln4f6\") pod \"authorino-85899c578f-5rw88\" (UID: \"27e579bf-250b-45f9-98f2-1aae247b31b9\") " pod="kuadrant-system/authorino-85899c578f-5rw88" May 11 21:01:06.178933 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:06.178809 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/27e579bf-250b-45f9-98f2-1aae247b31b9-tls-cert\") pod \"authorino-85899c578f-5rw88\" (UID: \"27e579bf-250b-45f9-98f2-1aae247b31b9\") " pod="kuadrant-system/authorino-85899c578f-5rw88" May 11 21:01:06.181286 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:06.181264 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/27e579bf-250b-45f9-98f2-1aae247b31b9-tls-cert\") pod \"authorino-85899c578f-5rw88\" (UID: \"27e579bf-250b-45f9-98f2-1aae247b31b9\") " pod="kuadrant-system/authorino-85899c578f-5rw88" May 11 21:01:06.186884 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:06.186829 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln4f6\" (UniqueName: \"kubernetes.io/projected/27e579bf-250b-45f9-98f2-1aae247b31b9-kube-api-access-ln4f6\") pod \"authorino-85899c578f-5rw88\" (UID: \"27e579bf-250b-45f9-98f2-1aae247b31b9\") " pod="kuadrant-system/authorino-85899c578f-5rw88" May 11 21:01:06.248888 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:06.248868 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85899c578f-5rw88" May 11 21:01:06.376855 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:06.376829 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85899c578f-5rw88"] May 11 21:01:06.379032 ip-10-0-135-190 kubenswrapper[2562]: W0511 21:01:06.378989 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27e579bf_250b_45f9_98f2_1aae247b31b9.slice/crio-7c0586700b1b968936ee08eb7a33c5831288089eeac84a0374bf977f75f43830 WatchSource:0}: Error finding container 7c0586700b1b968936ee08eb7a33c5831288089eeac84a0374bf977f75f43830: Status 404 returned error can't find the container with id 7c0586700b1b968936ee08eb7a33c5831288089eeac84a0374bf977f75f43830 May 11 21:01:06.484529 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:06.484497 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85899c578f-5rw88" event={"ID":"27e579bf-250b-45f9-98f2-1aae247b31b9","Type":"ContainerStarted","Data":"7c0586700b1b968936ee08eb7a33c5831288089eeac84a0374bf977f75f43830"} May 11 21:01:07.490027 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:07.489975 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85899c578f-5rw88" event={"ID":"27e579bf-250b-45f9-98f2-1aae247b31b9","Type":"ContainerStarted","Data":"097bf85965063749bb92d701164b9bc8852e074d84ad40f90aaf47a612682847"} May 11 21:01:07.514259 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:07.514193 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-85899c578f-5rw88" podStartSLOduration=2.072251012 podStartE2EDuration="2.514173752s" podCreationTimestamp="2026-05-11 21:01:05 +0000 UTC" firstStartedPulling="2026-05-11 21:01:06.380508699 +0000 UTC m=+649.678653405" lastFinishedPulling="2026-05-11 21:01:06.822431452 +0000 UTC m=+650.120576145" observedRunningTime="2026-05-11 21:01:07.512736429 +0000 UTC m=+650.810881139" watchObservedRunningTime="2026-05-11 21:01:07.514173752 +0000 UTC m=+650.812318462" May 11 21:01:07.564987 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:07.564952 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-89ffbfd88-gv29n"] May 11 21:01:07.565533 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:07.565497 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-89ffbfd88-gv29n" podUID="9b7d6939-346e-4577-bda4-2b88d002fa0b" containerName="authorino" containerID="cri-o://a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd" gracePeriod=30 May 11 21:01:07.812488 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:07.812460 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 21:01:07.998831 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:07.998795 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpzqc\" (UniqueName: \"kubernetes.io/projected/9b7d6939-346e-4577-bda4-2b88d002fa0b-kube-api-access-rpzqc\") pod \"9b7d6939-346e-4577-bda4-2b88d002fa0b\" (UID: \"9b7d6939-346e-4577-bda4-2b88d002fa0b\") " May 11 21:01:07.998831 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:07.998838 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b7d6939-346e-4577-bda4-2b88d002fa0b-tls-cert\") pod \"9b7d6939-346e-4577-bda4-2b88d002fa0b\" (UID: \"9b7d6939-346e-4577-bda4-2b88d002fa0b\") " May 11 21:01:08.000978 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.000940 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7d6939-346e-4577-bda4-2b88d002fa0b-kube-api-access-rpzqc" (OuterVolumeSpecName: "kube-api-access-rpzqc") pod "9b7d6939-346e-4577-bda4-2b88d002fa0b" (UID: "9b7d6939-346e-4577-bda4-2b88d002fa0b"). InnerVolumeSpecName "kube-api-access-rpzqc". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 11 21:01:08.008844 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.008819 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7d6939-346e-4577-bda4-2b88d002fa0b-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "9b7d6939-346e-4577-bda4-2b88d002fa0b" (UID: "9b7d6939-346e-4577-bda4-2b88d002fa0b"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 11 21:01:08.099991 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.099918 2562 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9b7d6939-346e-4577-bda4-2b88d002fa0b-tls-cert\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\"" May 11 21:01:08.099991 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.099958 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rpzqc\" (UniqueName: \"kubernetes.io/projected/9b7d6939-346e-4577-bda4-2b88d002fa0b-kube-api-access-rpzqc\") on node \"ip-10-0-135-190.ec2.internal\" DevicePath \"\"" May 11 21:01:08.494126 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.494086 2562 generic.go:358] "Generic (PLEG): container finished" podID="9b7d6939-346e-4577-bda4-2b88d002fa0b" containerID="a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd" exitCode=0 May 11 21:01:08.494571 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.494136 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-89ffbfd88-gv29n" May 11 21:01:08.494571 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.494168 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-89ffbfd88-gv29n" event={"ID":"9b7d6939-346e-4577-bda4-2b88d002fa0b","Type":"ContainerDied","Data":"a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd"} May 11 21:01:08.494571 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.494206 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-89ffbfd88-gv29n" event={"ID":"9b7d6939-346e-4577-bda4-2b88d002fa0b","Type":"ContainerDied","Data":"37ad4d4866539cc971a35ef10f57287d4c7e04bdb8e48baa088b2b2ca1b79844"} May 11 21:01:08.494571 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.494218 2562 scope.go:117] "RemoveContainer" containerID="a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd" May 11 21:01:08.502473 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.502455 2562 scope.go:117] "RemoveContainer" containerID="a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd" May 11 21:01:08.502734 ip-10-0-135-190 kubenswrapper[2562]: E0511 21:01:08.502709 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd\": container with ID starting with a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd not found: ID does not exist" containerID="a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd" May 11 21:01:08.502822 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.502735 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd"} err="failed to get container status \"a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd\": rpc error: code = NotFound desc = could not find container \"a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd\": container with ID starting with a22c9ffb16e8adb1355d7d2e6ce5ac2945f65556ec44ae39d3f01fea0e98cfcd not found: ID does not exist" May 11 21:01:08.514751 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.514732 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-89ffbfd88-gv29n"] May 11 21:01:08.519226 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:08.519207 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-89ffbfd88-gv29n"] May 11 21:01:09.314259 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:09.314225 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7d6939-346e-4577-bda4-2b88d002fa0b" path="/var/lib/kubelet/pods/9b7d6939-346e-4577-bda4-2b88d002fa0b/volumes" May 11 21:01:20.535099 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:20.535066 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" event={"ID":"305fc11d-a311-4c26-8966-cf4370ceb203","Type":"ContainerStarted","Data":"566ac22871c49aa37803fb3c8b177cec17641228e1d77f289ccdeffee801e1df"} May 11 21:01:20.535461 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:20.535270 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:01:20.554887 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:20.554836 2562 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" podStartSLOduration=7.378059047 podStartE2EDuration="43.554823345s" podCreationTimestamp="2026-05-11 21:00:37 +0000 UTC" firstStartedPulling="2026-05-11 21:00:43.4096683 +0000 UTC m=+626.707812989" lastFinishedPulling="2026-05-11 21:01:19.586432581 +0000 UTC m=+662.884577287" observedRunningTime="2026-05-11 21:01:20.553773239 +0000 UTC m=+663.851917961" watchObservedRunningTime="2026-05-11 21:01:20.554823345 +0000 UTC m=+663.852968055" May 11 21:01:31.551303 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:01:31.551273 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t" May 11 21:05:17.280494 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:05:17.280424 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:05:17.280494 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:05:17.280424 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:05:17.285584 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:05:17.285564 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:05:17.285719 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:05:17.285603 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:10:17.304040 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:10:17.303996 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:10:17.304933 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:10:17.304906 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:10:17.315461 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:10:17.315432 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:10:17.316656 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:10:17.316634 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:15:17.334360 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:15:17.334333 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:15:17.337239 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:15:17.337218 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:15:17.338899 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:15:17.338880 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:15:17.341560 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:15:17.341543 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:20:17.359397 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:20:17.359368 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:20:17.363493 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:20:17.363469 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:20:17.364723 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:20:17.364700 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:20:17.369315 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:20:17.369294 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:23:39.066721 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:39.066688 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-85899c578f-5rw88_27e579bf-250b-45f9-98f2-1aae247b31b9/authorino/0.log" May 11 21:23:43.981820 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:43.981785 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-755c95f69f-vwxsj_4fe0c3de-20cd-4ad3-83a8-876b1eebe765/manager/0.log" May 11 21:23:45.292511 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:45.292480 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-85899c578f-5rw88_27e579bf-250b-45f9-98f2-1aae247b31b9/authorino/0.log" May 11 21:23:45.514329 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:45.514298 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-xvb27_97722e29-4ce2-433e-8f05-24777f7757e9/manager/0.log" May 11 21:23:45.619952 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:45.619877 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-6fqwc_2b54a001-6e04-4358-ab53-53a29a425298/kuadrant-console-plugin/0.log" May 11 21:23:46.080972 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:46.080936 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-xdtdd_c3f98026-7cff-4e8b-8b1d-26cacd2d603d/manager/0.log" May 11 21:23:46.405343 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:46.405266 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-595b7776f85sn8z_77e76c93-c30b-4154-b356-fe63f4d57502/istio-proxy/0.log" May 11 21:23:46.852543 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:46.852508 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-7df95f575-pwd8c_53e4322f-0f5a-4032-8514-7db4b87dc759/istio-proxy/0.log" May 11 21:23:47.064018 
ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:47.063976 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7589cfd5f4-pdcvt_ea8d418d-c309-4692-8be3-e3a7eeb22225/router/0.log" May 11 21:23:47.400707 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:47.400675 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t_305fc11d-a311-4c26-8966-cf4370ceb203/storage-initializer/0.log" May 11 21:23:47.407193 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:47.407167 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-nmv7t_305fc11d-a311-4c26-8966-cf4370ceb203/main/0.log" May 11 21:23:47.852029 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:47.851980 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278_529c5254-c8df-4838-bdf0-32290763f25e/main/0.log" May 11 21:23:47.858375 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:47.858354 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-7x278_529c5254-c8df-4838-bdf0-32290763f25e/storage-initializer/0.log" May 11 21:23:47.961455 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:47.961430 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh_7f8fff36-e982-46d9-ab5b-55be58cfba20/storage-initializer/0.log" May 11 21:23:47.967708 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:47.967684 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-cxvqh_7f8fff36-e982-46d9-ab5b-55be58cfba20/main/0.log" May 11 21:23:59.956838 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:23:59.956806 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9qsqx_d85defd8-e86e-4d13-9e13-373afa866baa/global-pull-secret-syncer/0.log" May 11 21:24:00.106863 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:00.106833 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-grk72_c19352cd-f3ce-49f5-99aa-571926768a56/konnectivity-agent/0.log" May 11 21:24:00.199023 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:00.198983 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-190.ec2.internal_c5f0e82f4ac1559ad6c0ea2fd6d8dd2a/haproxy/0.log" May 11 21:24:04.634632 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:04.634597 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-85899c578f-5rw88_27e579bf-250b-45f9-98f2-1aae247b31b9/authorino/0.log" May 11 21:24:04.712114 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:04.712082 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-xvb27_97722e29-4ce2-433e-8f05-24777f7757e9/manager/0.log" May 11 21:24:04.746556 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:04.746524 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-6fqwc_2b54a001-6e04-4358-ab53-53a29a425298/kuadrant-console-plugin/0.log" May 11 21:24:04.976459 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:04.976423 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-xdtdd_c3f98026-7cff-4e8b-8b1d-26cacd2d603d/manager/0.log" May 11 21:24:06.556725 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:06.556695 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-5c487d988c-gldtx_a8dd435c-a454-4ae4-935a-67c1f9c9ec81/cluster-monitoring-operator/0.log" May 11 21:24:06.846802 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:06.846726 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gcmjj_b98ddfda-a947-4d98-b77a-55a1bddfc8e4/node-exporter/0.log" May 11 21:24:06.869756 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:06.869732 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gcmjj_b98ddfda-a947-4d98-b77a-55a1bddfc8e4/kube-rbac-proxy/0.log" May 11 21:24:06.897245 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:06.897224 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gcmjj_b98ddfda-a947-4d98-b77a-55a1bddfc8e4/init-textfile/0.log" May 11 21:24:08.603171 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.603141 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-697665887d-5q7f6_17f13489-a9eb-4f66-85c8-6967aa3ec01a/networking-console-plugin/0.log" May 11 21:24:08.766994 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.764436 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5"] May 11 21:24:08.766994 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.765286 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b7d6939-346e-4577-bda4-2b88d002fa0b" containerName="authorino" May 11 21:24:08.766994 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.765313 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7d6939-346e-4577-bda4-2b88d002fa0b" containerName="authorino" May 11 21:24:08.766994 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.765475 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b7d6939-346e-4577-bda4-2b88d002fa0b" containerName="authorino" May 11 21:24:08.768741 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.768718 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:08.772528 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.772502 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qcz2k\"/\"kube-root-ca.crt\"" May 11 21:24:08.772664 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.772510 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qcz2k\"/\"openshift-service-ca.crt\"" May 11 21:24:08.772664 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.772512 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qcz2k\"/\"default-dockercfg-4h4cq\"" May 11 21:24:08.773628 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.773602 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5"] May 11 21:24:08.945506 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.945429 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-proc\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:08.945506 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.945469 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-sys\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:08.945795 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.945575 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tq7m\" (UniqueName: \"kubernetes.io/projected/6ebfc92d-25c2-4e63-a020-783ef354b5ac-kube-api-access-5tq7m\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:08.945795 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.945629 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-podres\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:08.945795 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:08.945705 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-lib-modules\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.046274 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.046226 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-lib-modules\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " 
pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.046487 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.046314 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-proc\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.046487 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.046338 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-sys\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.046487 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.046370 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tq7m\" (UniqueName: \"kubernetes.io/projected/6ebfc92d-25c2-4e63-a020-783ef354b5ac-kube-api-access-5tq7m\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.046487 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.046401 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-podres\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.046487 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.046443 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-proc\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.046487 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.046447 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-sys\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.046487 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.046444 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-lib-modules\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.046714 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.046504 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6ebfc92d-25c2-4e63-a020-783ef354b5ac-podres\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.054923 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.054893 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5tq7m\" (UniqueName: \"kubernetes.io/projected/6ebfc92d-25c2-4e63-a020-783ef354b5ac-kube-api-access-5tq7m\") pod \"perf-node-gather-daemonset-z2hf5\" (UID: \"6ebfc92d-25c2-4e63-a020-783ef354b5ac\") " pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.080727 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.080706 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:09.189952 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.189927 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/2.log" May 11 21:24:09.194558 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.194532 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-77758f4558-kfkm4_bffdd990-0f6a-4e43-a62a-94c91746d6fc/console-operator/3.log" May 11 21:24:09.212114 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.212093 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5"] May 11 21:24:09.213806 ip-10-0-135-190 kubenswrapper[2562]: W0511 21:24:09.213772 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6ebfc92d_25c2_4e63_a020_783ef354b5ac.slice/crio-6e7512e159ca4975d705db7522f45c7aca9f7542c58bd750ad50426589eb3716 WatchSource:0}: Error finding container 6e7512e159ca4975d705db7522f45c7aca9f7542c58bd750ad50426589eb3716: Status 404 returned error can't find the container with id 6e7512e159ca4975d705db7522f45c7aca9f7542c58bd750ad50426589eb3716 May 11 21:24:09.215421 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:09.215402 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider May 11 21:24:10.165878 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:10.165840 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" event={"ID":"6ebfc92d-25c2-4e63-a020-783ef354b5ac","Type":"ContainerStarted","Data":"e1a34d09fbe93b0633825ed8f57460a5accf55eda1dedfb204a37b7c863d9533"} May 11 21:24:10.165878 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:10.165880 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" event={"ID":"6ebfc92d-25c2-4e63-a020-783ef354b5ac","Type":"ContainerStarted","Data":"6e7512e159ca4975d705db7522f45c7aca9f7542c58bd750ad50426589eb3716"} May 11 21:24:10.166406 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:10.165911 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:10.183139 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:10.183093 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" podStartSLOduration=2.183073744 podStartE2EDuration="2.183073744s" podCreationTimestamp="2026-05-11 21:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-05-11 21:24:10.181512714 +0000 UTC m=+2033.479657449" watchObservedRunningTime="2026-05-11 21:24:10.183073744 +0000 UTC m=+2033.481218458" May 11 21:24:10.202251 ip-10-0-135-190 
kubenswrapper[2562]: I0511 21:24:10.202228 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-6648d555c9-kgwqp_c0e49337-8925-47a4-9b6a-95c7bd4e9887/volume-data-source-validator/0.log" May 11 21:24:11.051816 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:11.051785 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n8sxs_7ff2f12c-70ce-4a2c-8828-4562a60dc95d/dns/0.log" May 11 21:24:11.070069 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:11.070046 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-n8sxs_7ff2f12c-70ce-4a2c-8828-4562a60dc95d/kube-rbac-proxy/0.log" May 11 21:24:11.134020 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:11.133992 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zhs5x_9d55bb28-de13-44d3-9322-9b22abc5dc03/dns-node-resolver/0.log" May 11 21:24:11.616164 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:11.616113 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p6nhn_a64fefbb-edf9-4ffa-adf6-0602e2c7e71b/node-ca/0.log" May 11 21:24:12.422895 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:12.422863 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-595b7776f85sn8z_77e76c93-c30b-4154-b356-fe63f4d57502/istio-proxy/0.log" May 11 21:24:12.687124 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:12.686987 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-7df95f575-pwd8c_53e4322f-0f5a-4032-8514-7db4b87dc759/istio-proxy/0.log" May 11 21:24:12.762995 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:12.762948 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7589cfd5f4-pdcvt_ea8d418d-c309-4692-8be3-e3a7eeb22225/router/0.log" May 11 21:24:13.280004 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:13.279970 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-p92nn_9d81ee0b-b7dc-45a9-bc60-e7389a88feb1/serve-healthcheck-canary/0.log" May 11 21:24:13.723251 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:13.723223 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-544c98cc96-j75m5_0311189e-d497-4d2c-a742-ad52f624750a/insights-operator/0.log" May 11 21:24:13.723774 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:13.723754 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-544c98cc96-j75m5_0311189e-d497-4d2c-a742-ad52f624750a/insights-operator/1.log" May 11 21:24:13.805527 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:13.805497 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jjq7d_36a1e166-0e33-4883-8711-cbbba5eb371c/kube-rbac-proxy/0.log" May 11 21:24:13.823241 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:13.823219 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jjq7d_36a1e166-0e33-4883-8711-cbbba5eb371c/exporter/0.log" May 11 21:24:13.841775 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:13.841753 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-jjq7d_36a1e166-0e33-4883-8711-cbbba5eb371c/extractor/0.log" May 11 21:24:16.039942 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:16.039908 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-755c95f69f-vwxsj_4fe0c3de-20cd-4ad3-83a8-876b1eebe765/manager/0.log" May 11 21:24:16.181405 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:16.181379 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qcz2k/perf-node-gather-daemonset-z2hf5" May 11 21:24:17.194306 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:17.194276 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-68d9b68cf6-vdhg2_6d48888a-4be7-4470-b095-efad539e3b56/manager/0.log" May 11 21:24:21.832147 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:21.832114 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-649b864788-qhtlj_d7e41df8-69e8-4481-9aa4-0456bce8d7df/kube-storage-version-migrator-operator/1.log" May 11 21:24:21.832900 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:21.832881 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-649b864788-qhtlj_d7e41df8-69e8-4481-9aa4-0456bce8d7df/kube-storage-version-migrator-operator/0.log" May 11 21:24:22.963585 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:22.963560 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4bgl_96cb7513-d136-4d23-90a5-47ea1604bb7b/kube-multus-additional-cni-plugins/0.log" May 11 21:24:22.983020 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:22.982992 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4bgl_96cb7513-d136-4d23-90a5-47ea1604bb7b/egress-router-binary-copy/0.log" May 11 21:24:23.000943 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:23.000915 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4bgl_96cb7513-d136-4d23-90a5-47ea1604bb7b/cni-plugins/0.log" May 11 21:24:23.019464 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:23.019438 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4bgl_96cb7513-d136-4d23-90a5-47ea1604bb7b/bond-cni-plugin/0.log" May 11 21:24:23.041752 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:23.041714 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4bgl_96cb7513-d136-4d23-90a5-47ea1604bb7b/routeoverride-cni/0.log" May 11 21:24:23.059733 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:23.059717 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4bgl_96cb7513-d136-4d23-90a5-47ea1604bb7b/whereabouts-cni-bincopy/0.log" May 11 21:24:23.078535 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:23.078517 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4bgl_96cb7513-d136-4d23-90a5-47ea1604bb7b/whereabouts-cni/0.log" May 11 21:24:23.261916 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:23.261844 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-zvbm7_037f8bf8-dffb-4ab0-806a-d440b0092789/kube-multus/0.log" May 11 21:24:23.285871 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:23.285850 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2ccqq_3be5f296-2151-4f3e-b028-c72728d855da/network-metrics-daemon/0.log" May 11 21:24:23.303541 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:23.303519 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2ccqq_3be5f296-2151-4f3e-b028-c72728d855da/kube-rbac-proxy/0.log" May 11 21:24:24.444821 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:24.444793 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-controller/0.log" May 11 21:24:24.464760 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:24.464732 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/0.log" May 11 21:24:24.473524 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:24.473500 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovn-acl-logging/1.log" May 11 21:24:24.489753 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:24.489733 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/kube-rbac-proxy-node/0.log" May 11 21:24:24.509730 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:24.509707 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/kube-rbac-proxy-ovn-metrics/0.log" May 11 21:24:24.528332 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:24.528313 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/northd/0.log" May 11 21:24:24.549661 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:24.549639 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/nbdb/0.log" May 11 21:24:24.570595 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:24.570577 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/sbdb/0.log" May 11 21:24:24.662881 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:24.662854 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-knpxn_9c60e645-398a-4781-9df4-1e5322dfe01e/ovnkube-controller/0.log" May 11 21:24:26.043588 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:26.043561 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-6859b67c86-hzqqm_aa10b731-c86e-4be7-b230-1ef6c613b38f/check-endpoints/0.log" May 11 21:24:26.103619 ip-10-0-135-190 kubenswrapper[2562]: I0511 21:24:26.103591 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xw5qw_bae2e16d-3454-4522-88aa-1afafb2e9cb1/network-check-target-container/0.log"