Apr 17 15:17:23.426402 ip-10-0-133-75 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 15:17:23.874683 ip-10-0-133-75 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 15:17:23.874683 ip-10-0-133-75 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 15:17:23.874683 ip-10-0-133-75 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 15:17:23.874683 ip-10-0-133-75 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 15:17:23.874683 ip-10-0-133-75 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 15:17:23.878625 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.878501 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 15:17:23.883180 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883162 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 15:17:23.883180 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883181 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883186 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883190 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883193 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883196 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883200 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883203 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883205 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883209 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883211 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883214 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883217 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883220 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883226 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883229 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883232 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883235 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883237 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883240 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883242 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 15:17:23.883252 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883245 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883247 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883250 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883253 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883256 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883259 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883262 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883265 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883268 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883270 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883273 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883276 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883278 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883281 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883283 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883286 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883288 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883291 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883293 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883297 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 15:17:23.883767 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883299 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883302 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883304 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883307 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883309 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883312 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883315 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883317 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883320 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883322 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883326 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883329 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883331 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883334 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883338 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883341 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883344 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883347 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883349 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 15:17:23.884280 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883352 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883355 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883357 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883360 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883371 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883374 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883377 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883379 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883384 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883388 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883392 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883395 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883397 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883400 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883402 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883405 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883408 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883412 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883416 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 15:17:23.884736 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883419 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883422 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883425 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883427 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883430 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883432 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883435 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883865 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883872 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883876 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883879 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883883 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883886 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883889 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883892 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883895 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883898 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883901 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883903 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 15:17:23.885203 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883906 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883910 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883914 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883917 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883920 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883923 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883925 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883928 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883930 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883933 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883935 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883938 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883941 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883943 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883946 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883948 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883951 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883954 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883956 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883959 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 15:17:23.885657 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883962 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883965 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883968 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883970 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883973 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883976 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883978 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883981 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883984 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883986 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883989 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883991 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883994 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883997 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.883999 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884002 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884004 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884007 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884010 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884012 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 15:17:23.886208 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884015 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884018 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884020 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884022 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884027 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884030 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884033 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884036 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884039 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884041 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884064 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884068 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884070 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884074 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884077 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884080 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884083 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884086 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884089 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884092 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 15:17:23.886702 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884095 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884098 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884100 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884103 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884106 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884108 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884111 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884113 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884116 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884118 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884122 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884124 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884127 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884129 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884204 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884211 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884218 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884223 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884227 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884231 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884235 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 15:17:23.887217 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884239 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884243 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884246 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884250 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884255 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884259 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884262 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884265 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884268 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884271 2576 flags.go:64] FLAG: --cloud-config=""
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884274 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884277 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884283 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884286 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884290 2576 flags.go:64] FLAG: --config-dir=""
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884293 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884296 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884301 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884304 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884307 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884310 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884314 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884318 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884321 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884324 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 15:17:23.887759 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884328 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884332 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884335 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884339 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884342 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884345 2576 flags.go:64] FLAG: --enable-server="true"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884348 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884356 2576 flags.go:64] FLAG: --event-burst="100"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884359 2576 flags.go:64] FLAG: --event-qps="50"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884362 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884365 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884368 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884373 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884376 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884379 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884383 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884385 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884388 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884392 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884394 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884398 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884401 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884404 2576 flags.go:64] FLAG: --feature-gates=""
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884407 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884410 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 15:17:23.888384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884414 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884417 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884420 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884423 2576 flags.go:64] FLAG: --help="false"
Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884426 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-133-75.ec2.internal"
Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884430 2576 flags.go:64] FLAG:
--housekeeping-interval="10s" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884433 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884436 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884440 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884443 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884446 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884449 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884452 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884455 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884458 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884461 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884464 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884467 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884470 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: 
I0417 15:17:23.884473 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884476 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884479 2576 flags.go:64] FLAG: --lock-file="" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884482 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884485 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 15:17:23.888995 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884488 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884494 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884497 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884500 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884503 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884506 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884509 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884512 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884515 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884520 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 15:17:23.889589 
ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884523 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884531 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884534 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884537 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884541 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884544 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884547 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884550 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884553 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884561 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884564 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884568 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884571 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 15:17:23.889589 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884574 2576 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884581 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884584 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884588 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884591 2576 flags.go:64] FLAG: --port="10250" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884594 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884597 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-038589ac90a1ccd2c" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884600 2576 flags.go:64] FLAG: --qos-reserved="" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884604 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884607 2576 flags.go:64] FLAG: --register-node="true" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884610 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884612 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884620 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884624 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884626 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 15:17:23.890159 
ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884629 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884633 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884636 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884639 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884642 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884646 2576 flags.go:64] FLAG: --runonce="false" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884649 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884652 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884655 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884677 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884681 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 15:17:23.890159 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884684 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884690 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884693 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884696 2576 flags.go:64] 
FLAG: --storage-driver-secure="false" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884700 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884703 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884705 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884709 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884712 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884715 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884724 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884726 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884729 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884735 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884738 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884741 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884744 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884747 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 15:17:23.890794 ip-10-0-133-75 
kubenswrapper[2576]: I0417 15:17:23.884750 2576 flags.go:64] FLAG: --v="2" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884763 2576 flags.go:64] FLAG: --version="false" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884767 2576 flags.go:64] FLAG: --vmodule="" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884772 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.884775 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884877 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 15:17:23.890794 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884881 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884884 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884887 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884890 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884893 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884896 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884899 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884902 2576 feature_gate.go:328] unrecognized 
feature gate: IngressControllerLBSubnetsAWS Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884905 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884909 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884912 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884915 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884917 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884920 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884922 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884925 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884928 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884930 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884934 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 15:17:23.891780 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884937 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 15:17:23.891780 ip-10-0-133-75 
kubenswrapper[2576]: W0417 15:17:23.884940 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884942 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884945 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884948 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884950 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884953 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884956 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884958 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884961 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884964 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884967 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884969 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884972 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 15:17:23.892811 
ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884975 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884977 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884980 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884982 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884985 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884987 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.884990 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 15:17:23.892811 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885000 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885004 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885007 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885010 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885012 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885015 2576 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885017 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885020 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885023 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885025 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885030 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885032 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885035 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885038 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885042 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885059 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885062 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885065 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885068 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885070 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 15:17:23.893701 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885073 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885075 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885078 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885081 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885083 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885086 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885088 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885091 2576 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885094 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885097 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885099 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885102 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885105 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885115 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885118 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885120 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885123 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885125 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885128 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 15:17:23.894452 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885131 2576 feature_gate.go:328] unrecognized feature gate: 
VSphereMixedNodeEnv
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885134 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885136 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885140 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885143 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.885147 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.885782 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.892912 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.892934 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893008 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893016 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893021 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893026 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893031 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893036 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 15:17:23.894928 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893041 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893063 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893067 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893071 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893076 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893082 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893089 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893093 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893097 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893101 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893106 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893110 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893115 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893119 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893124 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893128 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893133 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893138 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893142 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893146 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 15:17:23.895398 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893151 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893156 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893160 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893164 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893169 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893175 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893180 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893184 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893189 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893193 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893197 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893201 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893205 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893209 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893213 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893217 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893222 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893226 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893230 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893234 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 15:17:23.896195 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893238 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893242 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893246 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893251 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893255 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893259 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893264 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893268 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893272 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893277 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893283 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893290 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893295 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893300 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893305 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893309 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893314 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893318 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893323 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 15:17:23.896929 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893327 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893331 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893335 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893340 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893344 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893348 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893352 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893357 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893361 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893365 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893368 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893373 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893377 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893381 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893385 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893389 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893394 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893398 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893402 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 15:17:23.897457 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893407 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893411 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.893420 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893582 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893591 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893596 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893601 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893606 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893610 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893614 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893619 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893623 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893627 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893632 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893636 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 15:17:23.898109 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893641 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893645 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893649 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893653 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893657 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893662 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893667 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893671 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893676 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893680 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893684 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893688 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893692 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893696 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893700 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893704 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893710 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893715 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893719 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893724 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 15:17:23.898603 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893728 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893733 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893738 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893742 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893746 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893751 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893756 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893760 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893764 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893768 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893773 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893778 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893784 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893790 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893794 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893799 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893803 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893807 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893812 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 15:17:23.899222 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893816 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893820 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893824 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893829 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893833 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893837 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893841 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893844 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893849 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893853 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893857 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893861 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893865 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893869 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893874 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893878 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893882 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893886 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893890 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893894 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 15:17:23.899748 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893898 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893902 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893906 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893911 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893915 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893920 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893924 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893928 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893932 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893936 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893940 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893944 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893948 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893952 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:23.893957 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.893964 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 15:17:23.900266 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.894955 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 15:17:23.900694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.898391 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 15:17:23.900694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.899403 2576 server.go:1019] "Starting client certificate rotation"
Apr 17 15:17:23.900694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.899503 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 15:17:23.900694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.900372 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 15:17:23.923810 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.923781 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 15:17:23.926893 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.926489 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 15:17:23.940344 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.940321 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 17 15:17:23.945769 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.945749 2576 log.go:25] "Validated CRI v1 image API"
Apr 17 15:17:23.947310 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.947291 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 15:17:23.947911 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.947891 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 15:17:23.952442 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.952423 2576 fs.go:135] Filesystem UUIDs: map[25b16e9b-0807-4ff7-bd11-5dce510f718e:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 93292b41-a384-4d63-ac40-6a8d9f119596:/dev/nvme0n1p3]
Apr 17 15:17:23.952501 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.952443 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 15:17:23.959326 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.959212 2576 manager.go:217] Machine: {Timestamp:2026-04-17 15:17:23.956926786 +0000 UTC m=+0.413108875 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100674 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e0a23c547d0b34ee25bfe297b3406 SystemUUID:ec2e0a23-c547-d0b3-4ee2-5bfe297b3406 BootID:02608d57-1c6f-4cda-a1f8-600dc20aa703 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f4:d1:42:ff:77 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f4:d1:42:ff:77 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ea:51:2a:b4:07:74 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 15:17:23.959326 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.959322 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 15:17:23.959485 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.959472 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 15:17:23.960575 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.960550 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 15:17:23.960718 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.960578 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-75.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 15:17:23.960762 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.960728 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 15:17:23.960762 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.960737 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 15:17:23.960762 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.960750 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 15:17:23.961571 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.961560 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 15:17:23.962283 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.962273 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 15:17:23.962406 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.962397 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 15:17:23.964827 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.964816 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 17 15:17:23.964870 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.964836 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 15:17:23.964870 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.964852 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 15:17:23.964870 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.964861 2576 kubelet.go:397] "Adding apiserver pod source" Apr 17 15:17:23.964870 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.964869 2576 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 15:17:23.966032 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.966020 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 15:17:23.966085 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.966039 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 15:17:23.967132 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.967113 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7mmxv" Apr 17 15:17:23.969489 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.969471 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 15:17:23.970851 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.970833 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 15:17:23.972578 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972562 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 15:17:23.972650 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972583 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 15:17:23.972650 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972592 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 15:17:23.972650 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972599 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 15:17:23.972650 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972608 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 15:17:23.972650 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972617 2576 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 15:17:23.972650 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972626 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 15:17:23.972650 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972649 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 15:17:23.972881 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972660 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 15:17:23.972881 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972671 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 15:17:23.972881 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972689 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 15:17:23.972881 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.972703 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 15:17:23.975338 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.975313 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 15:17:23.975408 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.975340 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 15:17:23.975585 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.975554 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7mmxv" Apr 17 15:17:23.978375 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:23.978346 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 15:17:23.978458 ip-10-0-133-75 kubenswrapper[2576]: E0417 
15:17:23.978397 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-75.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 15:17:23.980324 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.980309 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 15:17:23.980385 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.980349 2576 server.go:1295] "Started kubelet" Apr 17 15:17:23.981335 ip-10-0-133-75 systemd[1]: Started Kubernetes Kubelet. Apr 17 15:17:23.981463 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.981300 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 15:17:23.981463 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.981364 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 15:17:23.981463 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.981324 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 15:17:23.982547 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.982534 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 17 15:17:23.982855 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.982839 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 15:17:23.986918 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.986897 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 15:17:23.987462 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.987447 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 15:17:23.988010 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.987992 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to 
run" Apr 17 15:17:23.988010 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.987993 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 15:17:23.988153 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.988020 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 15:17:23.988153 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.988106 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 17 15:17:23.988153 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.988118 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 17 15:17:23.988378 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:23.988358 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:23.989997 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.989935 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 15:17:23.990324 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.990304 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-75.ec2.internal" not found Apr 17 15:17:23.992221 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:23.992199 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-75.ec2.internal\" not found" node="ip-10-0-133-75.ec2.internal" Apr 17 15:17:23.992687 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.992669 2576 factory.go:55] Registering systemd factory Apr 17 15:17:23.992776 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.992690 2576 factory.go:223] Registration of the systemd container factory successfully Apr 17 15:17:23.992953 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.992939 2576 factory.go:153] Registering CRI-O factory Apr 17 15:17:23.992953 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.992957 2576 factory.go:223] Registration of the crio container factory 
successfully Apr 17 15:17:23.993091 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.993077 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 15:17:23.993150 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.993104 2576 factory.go:103] Registering Raw factory Apr 17 15:17:23.993150 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.993115 2576 manager.go:1196] Started watching for new ooms in manager Apr 17 15:17:23.993578 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:23.993565 2576 manager.go:319] Starting recovery of all containers Apr 17 15:17:23.993811 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:23.993765 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 15:17:24.004631 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.004611 2576 manager.go:324] Recovery completed Apr 17 15:17:24.008667 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.008651 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-75.ec2.internal" not found Apr 17 15:17:24.009665 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.009652 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:24.012421 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.012406 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:24.012499 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.012438 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:24.012499 ip-10-0-133-75 kubenswrapper[2576]: I0417 
15:17:24.012453 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:24.012966 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.012953 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 15:17:24.012966 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.012965 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 15:17:24.013030 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.012982 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 15:17:24.016217 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.016202 2576 policy_none.go:49] "None policy: Start" Apr 17 15:17:24.016288 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.016232 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 15:17:24.016288 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.016247 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.059261 2576 manager.go:341] "Starting Device Plugin manager" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.059298 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.059308 2576 server.go:85] "Starting device plugin registration server" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.059612 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.059625 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.059739 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" 
Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.059831 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.059845 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.060351 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.060384 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.079611 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.068530 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-75.ec2.internal" not found Apr 17 15:17:24.133363 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.133266 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 15:17:24.134687 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.134669 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 15:17:24.134774 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.134697 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 15:17:24.134774 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.134719 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 15:17:24.134774 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.134725 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 15:17:24.134892 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.134812 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 15:17:24.137856 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.137837 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 15:17:24.160511 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.160485 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:24.161436 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.161422 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:24.161509 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.161453 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:24.161509 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.161466 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:24.161509 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.161489 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.169566 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.169541 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.169653 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.169569 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-75.ec2.internal\": node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.184055 
ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.184032 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.235647 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.235605 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal"] Apr 17 15:17:24.235764 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.235720 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:24.237274 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.237258 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:24.237346 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.237288 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:24.237346 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.237298 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:24.239617 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.239604 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:24.239776 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.239763 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.239776 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.239792 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:24.240330 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.240311 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:24.240419 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.240345 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:24.240419 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.240359 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:24.240419 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.240384 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:24.240419 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.240411 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:24.240549 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.240423 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:24.242666 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.242648 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.242757 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.242674 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 15:17:24.243416 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.243399 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientMemory" Apr 17 15:17:24.243483 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.243434 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 15:17:24.243483 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.243444 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientPID" Apr 17 15:17:24.265848 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.265820 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-75.ec2.internal\" not found" node="ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.270244 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.270225 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-75.ec2.internal\" not found" node="ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.284815 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.284791 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.291270 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.291251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.291335 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.291279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.291335 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.291303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5401552a10b9bd31fa1f4a18dcace9bb-config\") pod \"kube-apiserver-proxy-ip-10-0-133-75.ec2.internal\" (UID: \"5401552a10b9bd31fa1f4a18dcace9bb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.385813 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.385745 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.392167 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.392143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.392221 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.392174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.392221 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.392193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5401552a10b9bd31fa1f4a18dcace9bb-config\") pod \"kube-apiserver-proxy-ip-10-0-133-75.ec2.internal\" (UID: \"5401552a10b9bd31fa1f4a18dcace9bb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.392282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.392244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.392282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.392244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5401552a10b9bd31fa1f4a18dcace9bb-config\") pod \"kube-apiserver-proxy-ip-10-0-133-75.ec2.internal\" (UID: \"5401552a10b9bd31fa1f4a18dcace9bb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.392341 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.392244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.486632 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.486588 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.568124 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.568093 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.572663 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.572646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" Apr 17 15:17:24.587121 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.587088 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.687622 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.687532 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.788015 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.787967 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.803965 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.803927 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 15:17:24.888915 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.888878 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:24.899184 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.899163 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using 
new credentials" Apr 17 15:17:24.899321 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.899302 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 15:17:24.899369 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.899324 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 15:17:24.899369 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.899323 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 15:17:24.977987 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.977798 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 15:12:23 +0000 UTC" deadline="2028-01-10 12:45:56.984991774 +0000 UTC" Apr 17 15:17:24.977987 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.977984 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15189h28m32.00701267s" Apr 17 15:17:24.987998 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:24.987971 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 15:17:24.988970 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:24.988952 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:25.007206 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.007180 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 15:17:25.031134 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.031099 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-s6n5l" Apr 17 15:17:25.037320 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.037297 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-s6n5l" Apr 17 15:17:25.084158 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:25.084115 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5401552a10b9bd31fa1f4a18dcace9bb.slice/crio-25138beea374544cd6de808ff0615114675106305f53e4a306452711573dbdc4 WatchSource:0}: Error finding container 25138beea374544cd6de808ff0615114675106305f53e4a306452711573dbdc4: Status 404 returned error can't find the container with id 25138beea374544cd6de808ff0615114675106305f53e4a306452711573dbdc4 Apr 17 15:17:25.084691 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:25.084671 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode52f89589d514c455852c1cdd49a71bd.slice/crio-29d56b4e4c0bccb2c139f010a7bf85292e08dc3b6a4a23377dc961b2d49bb299 WatchSource:0}: Error finding container 29d56b4e4c0bccb2c139f010a7bf85292e08dc3b6a4a23377dc961b2d49bb299: Status 404 returned error can't find the container with id 29d56b4e4c0bccb2c139f010a7bf85292e08dc3b6a4a23377dc961b2d49bb299 Apr 17 15:17:25.089835 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:25.089574 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:25.089835 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.089676 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 15:17:25.137940 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.137873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" event={"ID":"5401552a10b9bd31fa1f4a18dcace9bb","Type":"ContainerStarted","Data":"25138beea374544cd6de808ff0615114675106305f53e4a306452711573dbdc4"} Apr 17 15:17:25.138789 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.138770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" event={"ID":"e52f89589d514c455852c1cdd49a71bd","Type":"ContainerStarted","Data":"29d56b4e4c0bccb2c139f010a7bf85292e08dc3b6a4a23377dc961b2d49bb299"} Apr 17 15:17:25.190135 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:25.190088 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found" Apr 17 15:17:25.264156 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.264072 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 15:17:25.288476 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.288440 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" Apr 17 15:17:25.296302 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.296273 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 15:17:25.297387 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.297371 2576 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" Apr 17 15:17:25.306177 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.306158 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 15:17:25.966518 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.966474 2576 apiserver.go:52] "Watching apiserver" Apr 17 15:17:25.974519 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.974483 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 15:17:25.974899 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.974867 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-2vktv","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz","openshift-image-registry/node-ca-wqrkq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal","openshift-multus/multus-vndw6","openshift-ovn-kubernetes/ovnkube-node-7p8ns","kube-system/konnectivity-agent-jgfqk","kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal","openshift-cluster-node-tuning-operator/tuned-7sgdp","openshift-dns/node-resolver-bpw5n","openshift-multus/multus-additional-cni-plugins-ffhn2","openshift-multus/network-metrics-daemon-82cq8","openshift-network-diagnostics/network-check-target-n2x89"] Apr 17 15:17:25.977461 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.977432 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bpw5n" Apr 17 15:17:25.979566 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.979540 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:25.979686 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.979576 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-znk6p\"" Apr 17 15:17:25.979773 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.979754 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 15:17:25.979962 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.979875 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 15:17:25.981658 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.981637 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 15:17:25.981793 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.981770 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xm52c\"" Apr 17 15:17:25.981882 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.981830 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 15:17:25.982070 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.982040 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 15:17:25.984542 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.984522 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wqrkq" Apr 17 15:17:25.984642 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.984616 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vndw6" Apr 17 15:17:25.987730 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.987214 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:25.987730 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.987240 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 15:17:25.987730 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.987240 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 15:17:25.987730 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.987546 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 15:17:25.987730 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.987640 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 15:17:25.987730 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.987668 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 15:17:25.987730 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.987706 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 15:17:25.988216 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.987771 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 15:17:25.988216 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.987867 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-f72dc\"" Apr 17 15:17:25.988216 
ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.988000 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q5mqj\"" Apr 17 15:17:25.989840 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.989751 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 15:17:25.989962 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.989897 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 15:17:25.990060 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.990025 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jgfqk" Apr 17 15:17:25.993446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.991925 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 15:17:25.993446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.992133 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pr6fw\"" Apr 17 15:17:25.993446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.992178 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 15:17:25.993446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.992686 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 15:17:25.993446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.992865 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 15:17:25.993446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.993006 2576 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 15:17:25.993446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.993292 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7xl9g\"" Apr 17 15:17:25.993446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.993387 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 15:17:25.997528 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.997503 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:25.997797 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.997778 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2vktv" Apr 17 15:17:25.999752 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.999733 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mwlrr\"" Apr 17 15:17:25.999860 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.999792 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 15:17:25.999956 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.999940 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 15:17:26.000020 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:25.999965 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 15:17:26.000104 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000026 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 15:17:26.000164 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000111 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" Apr 17 15:17:26.000246 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000229 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 15:17:26.000309 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000257 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lstsb\"" Apr 17 15:17:26.000507 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a00974bb-abc9-4285-909c-842f9c69b1f3-host\") pod \"node-ca-wqrkq\" (UID: \"a00974bb-abc9-4285-909c-842f9c69b1f3\") " pod="openshift-image-registry/node-ca-wqrkq" Apr 17 15:17:26.000613 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-cni-dir\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.000613 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-hostroot\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.000613 ip-10-0-133-75 kubenswrapper[2576]: I0417 
15:17:26.000596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-run-ovn\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.000749 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-cnibin\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.000749 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-socket-dir-parent\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.000749 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-var-lib-cni-bin\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.000749 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-run-netns\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 
15:17:26.000749 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-log-socket\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.000967 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqw7j\" (UniqueName: \"kubernetes.io/projected/1e3cfe5e-0c86-4d14-ac14-7390274f338b-kube-api-access-qqw7j\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.000967 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a00974bb-abc9-4285-909c-842f9c69b1f3-serviceca\") pod \"node-ca-wqrkq\" (UID: \"a00974bb-abc9-4285-909c-842f9c69b1f3\") " pod="openshift-image-registry/node-ca-wqrkq" Apr 17 15:17:26.000967 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-var-lib-cni-multus\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.000967 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-systemd-units\") pod \"ovnkube-node-7p8ns\" (UID: 
\"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.000967 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd8qz\" (UniqueName: \"kubernetes.io/projected/e70e8fd7-f8f2-4303-8371-1696921c6746-kube-api-access-pd8qz\") pod \"node-resolver-bpw5n\" (UID: \"e70e8fd7-f8f2-4303-8371-1696921c6746\") " pod="openshift-dns/node-resolver-bpw5n" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.000982 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-device-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-etc-selinux\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: 
\"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001083 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-system-cni-dir\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-os-release\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-conf-dir\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-run-multus-certs\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/e70e8fd7-f8f2-4303-8371-1696921c6746-tmp-dir\") pod \"node-resolver-bpw5n\" (UID: \"e70e8fd7-f8f2-4303-8371-1696921c6746\") " pod="openshift-dns/node-resolver-bpw5n" Apr 17 15:17:26.001282 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-run-k8s-cni-cncf-io\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-daemon-config\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-var-lib-openvswitch\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-run-ovn-kubernetes\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001395 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e3cfe5e-0c86-4d14-ac14-7390274f338b-ovnkube-script-lib\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4299c8c-3050-4ce9-9766-13f14ff297a7-agent-certs\") pod \"konnectivity-agent-jgfqk\" (UID: \"f4299c8c-3050-4ce9-9766-13f14ff297a7\") " pod="kube-system/konnectivity-agent-jgfqk" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdkf7\" (UniqueName: \"kubernetes.io/projected/1bcb7f53-9455-4609-b5aa-4190a92c8d15-kube-api-access-rdkf7\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-sys-fs\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: 
\"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-var-lib-kubelet\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-etc-kubernetes\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9xvb\" (UniqueName: \"kubernetes.io/projected/a89cc04d-c377-4ac2-9120-63ebc1ca2990-kube-api-access-h9xvb\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-slash\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-run-systemd\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a89cc04d-c377-4ac2-9120-63ebc1ca2990-cni-binary-copy\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001794 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-kubelet\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.001901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-etc-openvswitch\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-run-openvswitch\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-node-log\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-cni-bin\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.001969 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e3cfe5e-0c86-4d14-ac14-7390274f338b-ovnkube-config\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e70e8fd7-f8f2-4303-8371-1696921c6746-hosts-file\") pod \"node-resolver-bpw5n\" (UID: \"e70e8fd7-f8f2-4303-8371-1696921c6746\") " pod="openshift-dns/node-resolver-bpw5n"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-registration-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tcq\" (UniqueName: \"kubernetes.io/projected/a00974bb-abc9-4285-909c-842f9c69b1f3-kube-api-access-g4tcq\") pod \"node-ca-wqrkq\" (UID: \"a00974bb-abc9-4285-909c-842f9c69b1f3\") " pod="openshift-image-registry/node-ca-wqrkq"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-run-netns\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-cni-netd\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e3cfe5e-0c86-4d14-ac14-7390274f338b-env-overrides\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e3cfe5e-0c86-4d14-ac14-7390274f338b-ovn-node-metrics-cert\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4299c8c-3050-4ce9-9766-13f14ff297a7-konnectivity-ca\") pod \"konnectivity-agent-jgfqk\" (UID: \"f4299c8c-3050-4ce9-9766-13f14ff297a7\") " pod="kube-system/konnectivity-agent-jgfqk"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002260 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-socket-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002322 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002364 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 15:17:26.002656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002393 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kpt76\""
Apr 17 15:17:26.003326 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.002885 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:17:26.003326 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.002983 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117"
Apr 17 15:17:26.005274 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.005253 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89"
Apr 17 15:17:26.005377 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.005317 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d"
Apr 17 15:17:26.038131 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.038095 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 15:12:25 +0000 UTC" deadline="2028-01-30 12:31:20.183062878 +0000 UTC"
Apr 17 15:17:26.038131 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.038128 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15669h13m54.144938504s"
Apr 17 15:17:26.089897 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.089860 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 15:17:26.102464 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-var-lib-kubelet\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.102642 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-etc-kubernetes\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.102642 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-run-systemd\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.102642 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-run-systemd\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.102642 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-var-lib-kubelet\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.102642 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-etc-kubernetes\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.102642 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102600 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq48j\" (UniqueName: \"kubernetes.io/projected/6e474365-0ff7-4228-b6b7-3f49bc17a45b-kube-api-access-dq48j\") pod \"iptables-alerter-2vktv\" (UID: \"6e474365-0ff7-4228-b6b7-3f49bc17a45b\") " pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.102642 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-kubelet\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4299c8c-3050-4ce9-9766-13f14ff297a7-konnectivity-ca\") pod \"konnectivity-agent-jgfqk\" (UID: \"f4299c8c-3050-4ce9-9766-13f14ff297a7\") " pod="kube-system/konnectivity-agent-jgfqk"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e474365-0ff7-4228-b6b7-3f49bc17a45b-host-slash\") pod \"iptables-alerter-2vktv\" (UID: \"6e474365-0ff7-4228-b6b7-3f49bc17a45b\") " pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-kubelet\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102714 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqnvd\" (UniqueName: \"kubernetes.io/projected/fc69e676-8342-4380-a1ba-56fbb970d9d9-kube-api-access-bqnvd\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tcq\" (UniqueName: \"kubernetes.io/projected/a00974bb-abc9-4285-909c-842f9c69b1f3-kube-api-access-g4tcq\") pod \"node-ca-wqrkq\" (UID: \"a00974bb-abc9-4285-909c-842f9c69b1f3\") " pod="openshift-image-registry/node-ca-wqrkq"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-run-netns\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e3cfe5e-0c86-4d14-ac14-7390274f338b-env-overrides\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-system-cni-dir\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-var-lib-kubelet\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102887 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03062b60-45de-4e91-92e7-3959d5322bd1-tmp\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-run-netns\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a00974bb-abc9-4285-909c-842f9c69b1f3-host\") pod \"node-ca-wqrkq\" (UID: \"a00974bb-abc9-4285-909c-842f9c69b1f3\") " pod="openshift-image-registry/node-ca-wqrkq"
Apr 17 15:17:26.102975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.102947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a00974bb-abc9-4285-909c-842f9c69b1f3-host\") pod \"node-ca-wqrkq\" (UID: \"a00974bb-abc9-4285-909c-842f9c69b1f3\") " pod="openshift-image-registry/node-ca-wqrkq"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-hostroot\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-hostroot\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-modprobe-d\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-sys\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kknjw\" (UniqueName: \"kubernetes.io/projected/03062b60-45de-4e91-92e7-3959d5322bd1-kube-api-access-kknjw\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-cnibin\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-run-netns\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z72mk\" (UniqueName: \"kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk\") pod \"network-check-target-n2x89\" (UID: \"7400fb35-d1c0-4009-bdc4-483256d99f9d\") " pod="openshift-network-diagnostics/network-check-target-n2x89"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-cnibin\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a00974bb-abc9-4285-909c-842f9c69b1f3-serviceca\") pod \"node-ca-wqrkq\" (UID: \"a00974bb-abc9-4285-909c-842f9c69b1f3\") " pod="openshift-image-registry/node-ca-wqrkq"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-run-netns\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-var-lib-cni-multus\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f4299c8c-3050-4ce9-9766-13f14ff297a7-konnectivity-ca\") pod \"konnectivity-agent-jgfqk\" (UID: \"f4299c8c-3050-4ce9-9766-13f14ff297a7\") " pod="kube-system/konnectivity-agent-jgfqk"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-device-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-var-lib-cni-multus\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-system-cni-dir\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.103610 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e3cfe5e-0c86-4d14-ac14-7390274f338b-env-overrides\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-device-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-os-release\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-system-cni-dir\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-var-lib-openvswitch\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-cnibin\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-os-release\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6e474365-0ff7-4228-b6b7-3f49bc17a45b-iptables-alerter-script\") pod \"iptables-alerter-2vktv\" (UID: \"6e474365-0ff7-4228-b6b7-3f49bc17a45b\") " pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103511 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-var-lib-openvswitch\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-kubernetes\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-etc-openvswitch\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-cni-bin\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103642 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-etc-openvswitch\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-cni-bin\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103646 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a00974bb-abc9-4285-909c-842f9c69b1f3-serviceca\") pod \"node-ca-wqrkq\" (UID: \"a00974bb-abc9-4285-909c-842f9c69b1f3\") " pod="openshift-image-registry/node-ca-wqrkq"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9xvb\" (UniqueName: \"kubernetes.io/projected/a89cc04d-c377-4ac2-9120-63ebc1ca2990-kube-api-access-h9xvb\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.104458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-slash\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-kubelet-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-sysctl-conf\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-lib-modules\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-slash\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a89cc04d-c377-4ac2-9120-63ebc1ca2990-cni-binary-copy\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-node-log\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e3cfe5e-0c86-4d14-ac14-7390274f338b-ovnkube-config\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc69e676-8342-4380-a1ba-56fbb970d9d9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-node-log\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc69e676-8342-4380-a1ba-56fbb970d9d9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-systemd\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.103994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e70e8fd7-f8f2-4303-8371-1696921c6746-hosts-file\") pod \"node-resolver-bpw5n\" (UID: \"e70e8fd7-f8f2-4303-8371-1696921c6746\") " pod="openshift-dns/node-resolver-bpw5n"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-registration-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-cni-netd\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e3cfe5e-0c86-4d14-ac14-7390274f338b-ovn-node-metrics-cert\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e70e8fd7-f8f2-4303-8371-1696921c6746-hosts-file\") pod \"node-resolver-bpw5n\" (UID: \"e70e8fd7-f8f2-4303-8371-1696921c6746\") " pod="openshift-dns/node-resolver-bpw5n"
Apr 17 15:17:26.105222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-registration-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-os-release\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-socket-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-cni-netd\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName:
\"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-cni-dir\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-run-ovn\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-socket-dir-parent\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-var-lib-cni-bin\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-log-socket\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-run-ovn\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqw7j\" (UniqueName: \"kubernetes.io/projected/1e3cfe5e-0c86-4d14-ac14-7390274f338b-kube-api-access-qqw7j\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-socket-dir\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc69e676-8342-4380-a1ba-56fbb970d9d9-cni-binary-copy\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-log-socket\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104370 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-socket-dir-parent\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-var-lib-cni-bin\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.105991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-sysctl-d\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-systemd-units\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104469 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a89cc04d-c377-4ac2-9120-63ebc1ca2990-cni-binary-copy\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104481 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-systemd-units\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e3cfe5e-0c86-4d14-ac14-7390274f338b-ovnkube-config\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mbtw7\" (UniqueName: \"kubernetes.io/projected/41bb8b03-e874-455f-8416-b76d91f0f117-kube-api-access-mbtw7\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd8qz\" (UniqueName: \"kubernetes.io/projected/e70e8fd7-f8f2-4303-8371-1696921c6746-kube-api-access-pd8qz\") pod \"node-resolver-bpw5n\" (UID: \"e70e8fd7-f8f2-4303-8371-1696921c6746\") " pod="openshift-dns/node-resolver-bpw5n" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-etc-selinux\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-conf-dir\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 
15:17:26.104644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-run-multus-certs\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-run-ovn-kubernetes\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-etc-selinux\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4299c8c-3050-4ce9-9766-13f14ff297a7-agent-certs\") pod \"konnectivity-agent-jgfqk\" (UID: \"f4299c8c-3050-4ce9-9766-13f14ff297a7\") " pod="kube-system/konnectivity-agent-jgfqk" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-conf-dir\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.106790 ip-10-0-133-75 kubenswrapper[2576]: I0417 
15:17:26.104729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-sysconfig\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-run-multus-certs\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-run\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-host-run-ovn-kubernetes\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e70e8fd7-f8f2-4303-8371-1696921c6746-tmp-dir\") pod \"node-resolver-bpw5n\" (UID: \"e70e8fd7-f8f2-4303-8371-1696921c6746\") " pod="openshift-dns/node-resolver-bpw5n" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 
15:17:26.104822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-run-k8s-cni-cncf-io\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104855 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-daemon-config\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104871 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-cni-dir\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e3cfe5e-0c86-4d14-ac14-7390274f338b-ovnkube-script-lib\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.104960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a89cc04d-c377-4ac2-9120-63ebc1ca2990-host-run-k8s-cni-cncf-io\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.105012 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-host\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.105041 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/03062b60-45de-4e91-92e7-3959d5322bd1-etc-tuned\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.105085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-run-openvswitch\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.105089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e70e8fd7-f8f2-4303-8371-1696921c6746-tmp-dir\") pod \"node-resolver-bpw5n\" (UID: \"e70e8fd7-f8f2-4303-8371-1696921c6746\") " pod="openshift-dns/node-resolver-bpw5n" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.105134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdkf7\" (UniqueName: \"kubernetes.io/projected/1bcb7f53-9455-4609-b5aa-4190a92c8d15-kube-api-access-rdkf7\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.107629 ip-10-0-133-75 
kubenswrapper[2576]: I0417 15:17:26.105158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-sys-fs\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.105272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e3cfe5e-0c86-4d14-ac14-7390274f338b-run-openvswitch\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.107629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.105272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1bcb7f53-9455-4609-b5aa-4190a92c8d15-sys-fs\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" Apr 17 15:17:26.108200 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.105902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a89cc04d-c377-4ac2-9120-63ebc1ca2990-multus-daemon-config\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.108200 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.106493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e3cfe5e-0c86-4d14-ac14-7390274f338b-ovnkube-script-lib\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.108200 
ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.108028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e3cfe5e-0c86-4d14-ac14-7390274f338b-ovn-node-metrics-cert\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.108294 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.108203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f4299c8c-3050-4ce9-9766-13f14ff297a7-agent-certs\") pod \"konnectivity-agent-jgfqk\" (UID: \"f4299c8c-3050-4ce9-9766-13f14ff297a7\") " pod="kube-system/konnectivity-agent-jgfqk" Apr 17 15:17:26.115666 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.115640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd8qz\" (UniqueName: \"kubernetes.io/projected/e70e8fd7-f8f2-4303-8371-1696921c6746-kube-api-access-pd8qz\") pod \"node-resolver-bpw5n\" (UID: \"e70e8fd7-f8f2-4303-8371-1696921c6746\") " pod="openshift-dns/node-resolver-bpw5n" Apr 17 15:17:26.116118 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.116095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tcq\" (UniqueName: \"kubernetes.io/projected/a00974bb-abc9-4285-909c-842f9c69b1f3-kube-api-access-g4tcq\") pod \"node-ca-wqrkq\" (UID: \"a00974bb-abc9-4285-909c-842f9c69b1f3\") " pod="openshift-image-registry/node-ca-wqrkq" Apr 17 15:17:26.116328 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.116307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdkf7\" (UniqueName: \"kubernetes.io/projected/1bcb7f53-9455-4609-b5aa-4190a92c8d15-kube-api-access-rdkf7\") pod \"aws-ebs-csi-driver-node-xmhkz\" (UID: \"1bcb7f53-9455-4609-b5aa-4190a92c8d15\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" 
Apr 17 15:17:26.116424 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.116333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9xvb\" (UniqueName: \"kubernetes.io/projected/a89cc04d-c377-4ac2-9120-63ebc1ca2990-kube-api-access-h9xvb\") pod \"multus-vndw6\" (UID: \"a89cc04d-c377-4ac2-9120-63ebc1ca2990\") " pod="openshift-multus/multus-vndw6" Apr 17 15:17:26.116424 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.116344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqw7j\" (UniqueName: \"kubernetes.io/projected/1e3cfe5e-0c86-4d14-ac14-7390274f338b-kube-api-access-qqw7j\") pod \"ovnkube-node-7p8ns\" (UID: \"1e3cfe5e-0c86-4d14-ac14-7390274f338b\") " pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:26.205441 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc69e676-8342-4380-a1ba-56fbb970d9d9-cni-binary-copy\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2" Apr 17 15:17:26.205441 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2" Apr 17 15:17:26.205671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-sysctl-d\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:26.205671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtw7\" (UniqueName: \"kubernetes.io/projected/41bb8b03-e874-455f-8416-b76d91f0f117-kube-api-access-mbtw7\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:26.205671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-sysconfig\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:26.205671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-run\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:26.205671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-sysctl-d\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" Apr 17 15:17:26.205671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-sysconfig\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " 
pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.205671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.205671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-host\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/03062b60-45de-4e91-92e7-3959d5322bd1-etc-tuned\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-run\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq48j\" (UniqueName: \"kubernetes.io/projected/6e474365-0ff7-4228-b6b7-3f49bc17a45b-kube-api-access-dq48j\") pod \"iptables-alerter-2vktv\" (UID: \"6e474365-0ff7-4228-b6b7-3f49bc17a45b\") " pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-host\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e474365-0ff7-4228-b6b7-3f49bc17a45b-host-slash\") pod \"iptables-alerter-2vktv\" (UID: \"6e474365-0ff7-4228-b6b7-3f49bc17a45b\") " pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqnvd\" (UniqueName: \"kubernetes.io/projected/fc69e676-8342-4380-a1ba-56fbb970d9d9-kube-api-access-bqnvd\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-system-cni-dir\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-var-lib-kubelet\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e474365-0ff7-4228-b6b7-3f49bc17a45b-host-slash\") pod \"iptables-alerter-2vktv\" (UID: \"6e474365-0ff7-4228-b6b7-3f49bc17a45b\") " pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03062b60-45de-4e91-92e7-3959d5322bd1-tmp\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-system-cni-dir\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-modprobe-d\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-var-lib-kubelet\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-sys\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kknjw\" (UniqueName: \"kubernetes.io/projected/03062b60-45de-4e91-92e7-3959d5322bd1-kube-api-access-kknjw\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.205996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z72mk\" (UniqueName: \"kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk\") pod \"network-check-target-n2x89\" (UID: \"7400fb35-d1c0-4009-bdc4-483256d99f9d\") " pod="openshift-network-diagnostics/network-check-target-n2x89"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fc69e676-8342-4380-a1ba-56fbb970d9d9-cni-binary-copy\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-cnibin\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-sys\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6e474365-0ff7-4228-b6b7-3f49bc17a45b-iptables-alerter-script\") pod \"iptables-alerter-2vktv\" (UID: \"6e474365-0ff7-4228-b6b7-3f49bc17a45b\") " pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-kubernetes\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.206126 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-sysctl-conf\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-lib-modules\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.206200 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs podName:41bb8b03-e874-455f-8416-b76d91f0f117 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:26.706175731 +0000 UTC m=+3.162357824 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs") pod "network-metrics-daemon-82cq8" (UID: "41bb8b03-e874-455f-8416-b76d91f0f117") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-cnibin\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-lib-modules\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206275 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-kubernetes\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-modprobe-d\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc69e676-8342-4380-a1ba-56fbb970d9d9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc69e676-8342-4380-a1ba-56fbb970d9d9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-systemd\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.206694 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-sysctl-conf\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.207377 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/03062b60-45de-4e91-92e7-3959d5322bd1-etc-systemd\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.207377 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-os-release\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.207377 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fc69e676-8342-4380-a1ba-56fbb970d9d9-os-release\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.207377 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6e474365-0ff7-4228-b6b7-3f49bc17a45b-iptables-alerter-script\") pod \"iptables-alerter-2vktv\" (UID: \"6e474365-0ff7-4228-b6b7-3f49bc17a45b\") " pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.207377 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fc69e676-8342-4380-a1ba-56fbb970d9d9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.207377 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.206850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fc69e676-8342-4380-a1ba-56fbb970d9d9-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.209035 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.208990 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/03062b60-45de-4e91-92e7-3959d5322bd1-etc-tuned\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.209451 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.209431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03062b60-45de-4e91-92e7-3959d5322bd1-tmp\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.212244 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.212220 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:26.212351 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.212249 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:26.212351 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.212263 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z72mk for pod openshift-network-diagnostics/network-check-target-n2x89: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:26.212351 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.212343 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk podName:7400fb35-d1c0-4009-bdc4-483256d99f9d nodeName:}" failed. No retries permitted until 2026-04-17 15:17:26.712307854 +0000 UTC m=+3.168489954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z72mk" (UniqueName: "kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk") pod "network-check-target-n2x89" (UID: "7400fb35-d1c0-4009-bdc4-483256d99f9d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:26.214222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.214193 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kknjw\" (UniqueName: \"kubernetes.io/projected/03062b60-45de-4e91-92e7-3959d5322bd1-kube-api-access-kknjw\") pod \"tuned-7sgdp\" (UID: \"03062b60-45de-4e91-92e7-3959d5322bd1\") " pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.214222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.214200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqnvd\" (UniqueName: \"kubernetes.io/projected/fc69e676-8342-4380-a1ba-56fbb970d9d9-kube-api-access-bqnvd\") pod \"multus-additional-cni-plugins-ffhn2\" (UID: \"fc69e676-8342-4380-a1ba-56fbb970d9d9\") " pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.214396 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.214378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq48j\" (UniqueName: \"kubernetes.io/projected/6e474365-0ff7-4228-b6b7-3f49bc17a45b-kube-api-access-dq48j\") pod \"iptables-alerter-2vktv\" (UID: \"6e474365-0ff7-4228-b6b7-3f49bc17a45b\") " pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.214477 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.214461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtw7\" (UniqueName: \"kubernetes.io/projected/41bb8b03-e874-455f-8416-b76d91f0f117-kube-api-access-mbtw7\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:17:26.289213 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.289082 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bpw5n"
Apr 17 15:17:26.297068 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.297023 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz"
Apr 17 15:17:26.305787 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.305764 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vndw6"
Apr 17 15:17:26.313590 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.313559 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2vktv"
Apr 17 15:17:26.319120 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.319102 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wqrkq"
Apr 17 15:17:26.326762 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.326739 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:17:26.332348 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.332327 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-jgfqk"
Apr 17 15:17:26.337898 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.337879 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-7sgdp"
Apr 17 15:17:26.343450 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.343426 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ffhn2"
Apr 17 15:17:26.465636 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.465597 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:26.707777 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.707746 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 15:17:26.710786 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.710752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:17:26.710930 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.710861 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:26.710930 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.710927 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs podName:41bb8b03-e874-455f-8416-b76d91f0f117 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:27.710911715 +0000 UTC m=+4.167093788 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs") pod "network-metrics-daemon-82cq8" (UID: "41bb8b03-e874-455f-8416-b76d91f0f117") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:26.746504 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:26.746472 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode70e8fd7_f8f2_4303_8371_1696921c6746.slice/crio-44bb5f19fffdd905e92b11f427eb10fc8157636666ddc352b4212cb96686030b WatchSource:0}: Error finding container 44bb5f19fffdd905e92b11f427eb10fc8157636666ddc352b4212cb96686030b: Status 404 returned error can't find the container with id 44bb5f19fffdd905e92b11f427eb10fc8157636666ddc352b4212cb96686030b
Apr 17 15:17:26.747620 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:26.747595 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e474365_0ff7_4228_b6b7_3f49bc17a45b.slice/crio-e1ec291999f3c1bdcda50f374b1ae2feb2cf5e7f028efda8cb7a004f7f169fd1 WatchSource:0}: Error finding container e1ec291999f3c1bdcda50f374b1ae2feb2cf5e7f028efda8cb7a004f7f169fd1: Status 404 returned error can't find the container with id e1ec291999f3c1bdcda50f374b1ae2feb2cf5e7f028efda8cb7a004f7f169fd1
Apr 17 15:17:26.749436 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:26.749265 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc69e676_8342_4380_a1ba_56fbb970d9d9.slice/crio-402ed54f69a99bfd7eab2595391f0db0c4c455401498f397b889efd07ea99230 WatchSource:0}: Error finding container 402ed54f69a99bfd7eab2595391f0db0c4c455401498f397b889efd07ea99230: Status 404 returned error can't find the container with id 402ed54f69a99bfd7eab2595391f0db0c4c455401498f397b889efd07ea99230
Apr 17 15:17:26.752321 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:26.752295 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda89cc04d_c377_4ac2_9120_63ebc1ca2990.slice/crio-edb286605359bac902e2eeb5cff52714f29071c3ff6ee69bbe8f6c077c695b41 WatchSource:0}: Error finding container edb286605359bac902e2eeb5cff52714f29071c3ff6ee69bbe8f6c077c695b41: Status 404 returned error can't find the container with id edb286605359bac902e2eeb5cff52714f29071c3ff6ee69bbe8f6c077c695b41
Apr 17 15:17:26.754917 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:26.754892 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4299c8c_3050_4ce9_9766_13f14ff297a7.slice/crio-34b1dc298673654f9c339899038552e22389001f08012e7a80f86c492a82f46b WatchSource:0}: Error finding container 34b1dc298673654f9c339899038552e22389001f08012e7a80f86c492a82f46b: Status 404 returned error can't find the container with id 34b1dc298673654f9c339899038552e22389001f08012e7a80f86c492a82f46b
Apr 17 15:17:26.775610 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:26.775578 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03062b60_45de_4e91_92e7_3959d5322bd1.slice/crio-1f198f0ff128b6d808b96ff412dcadb76321d861631c4fe407456300c84b6c9f WatchSource:0}: Error finding container 1f198f0ff128b6d808b96ff412dcadb76321d861631c4fe407456300c84b6c9f: Status 404 returned error can't find the container with id 1f198f0ff128b6d808b96ff412dcadb76321d861631c4fe407456300c84b6c9f
Apr 17 15:17:26.776381 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:26.776360 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda00974bb_abc9_4285_909c_842f9c69b1f3.slice/crio-88dc32b29c559f3b3be2bfedceeab67042f1aa35f3ab278844bd2c72b81b942b WatchSource:0}: Error finding container 88dc32b29c559f3b3be2bfedceeab67042f1aa35f3ab278844bd2c72b81b942b: Status 404 returned error can't find the container with id 88dc32b29c559f3b3be2bfedceeab67042f1aa35f3ab278844bd2c72b81b942b
Apr 17 15:17:26.777236 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:26.777211 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bcb7f53_9455_4609_b5aa_4190a92c8d15.slice/crio-ea076b5cf92d86da09c23fe03834d2c62dc9df0525f2650a48f9e887ce8b784d WatchSource:0}: Error finding container ea076b5cf92d86da09c23fe03834d2c62dc9df0525f2650a48f9e887ce8b784d: Status 404 returned error can't find the container with id ea076b5cf92d86da09c23fe03834d2c62dc9df0525f2650a48f9e887ce8b784d
Apr 17 15:17:26.778210 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:26.778181 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e3cfe5e_0c86_4d14_ac14_7390274f338b.slice/crio-301ff8597cd2f14a03aff7963b15b2bbabf160c06787155057257c013d3d5ebd WatchSource:0}: Error finding container 301ff8597cd2f14a03aff7963b15b2bbabf160c06787155057257c013d3d5ebd: Status 404 returned error can't find the container with id 301ff8597cd2f14a03aff7963b15b2bbabf160c06787155057257c013d3d5ebd
Apr 17 15:17:26.811716 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:26.811684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z72mk\" (UniqueName: \"kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk\") pod \"network-check-target-n2x89\" (UID: \"7400fb35-d1c0-4009-bdc4-483256d99f9d\") " pod="openshift-network-diagnostics/network-check-target-n2x89"
Apr 17 15:17:26.811830 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.811802 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:26.811830 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.811817 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:26.811830 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.811828 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z72mk for pod openshift-network-diagnostics/network-check-target-n2x89: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:26.811947 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:26.811877 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk podName:7400fb35-d1c0-4009-bdc4-483256d99f9d nodeName:}" failed. No retries permitted until 2026-04-17 15:17:27.811862773 +0000 UTC m=+4.268044845 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-z72mk" (UniqueName: "kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk") pod "network-check-target-n2x89" (UID: "7400fb35-d1c0-4009-bdc4-483256d99f9d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:27.039019 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.038935 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 15:12:25 +0000 UTC" deadline="2027-11-17 20:05:54.511320209 +0000 UTC"
Apr 17 15:17:27.039019 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.038969 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13900h48m27.472353693s"
Apr 17 15:17:27.134964 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.134938 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:17:27.135250 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:27.135063 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117"
Apr 17 15:17:27.143400 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.143359 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" event={"ID":"5401552a10b9bd31fa1f4a18dcace9bb","Type":"ContainerStarted","Data":"4990aa49a9a44f77320995dddf8ceb38db62a3e572b3e4bdf740b75533ed40e4"}
Apr 17 15:17:27.144519 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.144481 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wqrkq" event={"ID":"a00974bb-abc9-4285-909c-842f9c69b1f3","Type":"ContainerStarted","Data":"88dc32b29c559f3b3be2bfedceeab67042f1aa35f3ab278844bd2c72b81b942b"}
Apr 17 15:17:27.145493 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.145474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jgfqk" event={"ID":"f4299c8c-3050-4ce9-9766-13f14ff297a7","Type":"ContainerStarted","Data":"34b1dc298673654f9c339899038552e22389001f08012e7a80f86c492a82f46b"}
Apr 17 15:17:27.146497 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.146476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" event={"ID":"fc69e676-8342-4380-a1ba-56fbb970d9d9","Type":"ContainerStarted","Data":"402ed54f69a99bfd7eab2595391f0db0c4c455401498f397b889efd07ea99230"}
Apr 17 15:17:27.147331 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.147301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2vktv" event={"ID":"6e474365-0ff7-4228-b6b7-3f49bc17a45b","Type":"ContainerStarted","Data":"e1ec291999f3c1bdcda50f374b1ae2feb2cf5e7f028efda8cb7a004f7f169fd1"}
Apr 17 15:17:27.148132 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.148112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bpw5n" event={"ID":"e70e8fd7-f8f2-4303-8371-1696921c6746","Type":"ContainerStarted","Data":"44bb5f19fffdd905e92b11f427eb10fc8157636666ddc352b4212cb96686030b"}
Apr 17 15:17:27.149009 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.148980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" event={"ID":"03062b60-45de-4e91-92e7-3959d5322bd1","Type":"ContainerStarted","Data":"1f198f0ff128b6d808b96ff412dcadb76321d861631c4fe407456300c84b6c9f"}
Apr 17 15:17:27.149895 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.149867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" event={"ID":"1bcb7f53-9455-4609-b5aa-4190a92c8d15","Type":"ContainerStarted","Data":"ea076b5cf92d86da09c23fe03834d2c62dc9df0525f2650a48f9e887ce8b784d"}
Apr 17 15:17:27.152952 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.152924 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerStarted","Data":"301ff8597cd2f14a03aff7963b15b2bbabf160c06787155057257c013d3d5ebd"}
Apr 17 15:17:27.153934 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.153913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vndw6" event={"ID":"a89cc04d-c377-4ac2-9120-63ebc1ca2990","Type":"ContainerStarted","Data":"edb286605359bac902e2eeb5cff52714f29071c3ff6ee69bbe8f6c077c695b41"}
Apr 17 15:17:27.154175 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.154133 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" podStartSLOduration=2.154122597 podStartE2EDuration="2.154122597s" podCreationTimestamp="2026-04-17 15:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:17:27.153845349 +0000 UTC m=+3.610027445" watchObservedRunningTime="2026-04-17 15:17:27.154122597 +0000 UTC m=+3.610304692"
Apr 17 15:17:27.718435 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.718394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:17:27.718635 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:27.718560 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:27.718635 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:27.718625 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs podName:41bb8b03-e874-455f-8416-b76d91f0f117 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:29.71860608 +0000 UTC m=+6.174788155 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs") pod "network-metrics-daemon-82cq8" (UID: "41bb8b03-e874-455f-8416-b76d91f0f117") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 15:17:27.820190 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:27.819497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z72mk\" (UniqueName: \"kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk\") pod \"network-check-target-n2x89\" (UID: \"7400fb35-d1c0-4009-bdc4-483256d99f9d\") " pod="openshift-network-diagnostics/network-check-target-n2x89"
Apr 17 15:17:27.820190 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:27.819694 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 15:17:27.820190 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:27.819714 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 15:17:27.820190 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:27.819726 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z72mk for pod openshift-network-diagnostics/network-check-target-n2x89: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 15:17:27.820190 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:27.819786 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk podName:7400fb35-d1c0-4009-bdc4-483256d99f9d nodeName:}" failed.
No retries permitted until 2026-04-17 15:17:29.819767462 +0000 UTC m=+6.275949540 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-z72mk" (UniqueName: "kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk") pod "network-check-target-n2x89" (UID: "7400fb35-d1c0-4009-bdc4-483256d99f9d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:28.138648 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:28.138012 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:28.138648 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:28.138159 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:28.180077 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:28.179828 2576 generic.go:358] "Generic (PLEG): container finished" podID="e52f89589d514c455852c1cdd49a71bd" containerID="48a81df109aa479583007ac33de4561d34fba37bac7bd3c214e3d6572e2d0aee" exitCode=0 Apr 17 15:17:28.180268 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:28.180128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" event={"ID":"e52f89589d514c455852c1cdd49a71bd","Type":"ContainerDied","Data":"48a81df109aa479583007ac33de4561d34fba37bac7bd3c214e3d6572e2d0aee"} Apr 17 15:17:29.135911 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:29.135880 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:29.136137 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:29.136020 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:29.196177 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:29.196135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" event={"ID":"e52f89589d514c455852c1cdd49a71bd","Type":"ContainerStarted","Data":"9b378e1cbf50c41b2cae558b21d65c595cacc8a3b264ea8137bec1613a3c2863"} Apr 17 15:17:29.207477 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:29.207424 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" podStartSLOduration=4.207405117 podStartE2EDuration="4.207405117s" podCreationTimestamp="2026-04-17 15:17:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:17:29.207247056 +0000 UTC m=+5.663429174" watchObservedRunningTime="2026-04-17 15:17:29.207405117 +0000 UTC m=+5.663587205" Apr 17 15:17:29.738747 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:29.738705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:29.738993 
ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:29.738937 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:29.739105 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:29.739010 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs podName:41bb8b03-e874-455f-8416-b76d91f0f117 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:33.738990582 +0000 UTC m=+10.195172658 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs") pod "network-metrics-daemon-82cq8" (UID: "41bb8b03-e874-455f-8416-b76d91f0f117") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:29.839220 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:29.839178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z72mk\" (UniqueName: \"kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk\") pod \"network-check-target-n2x89\" (UID: \"7400fb35-d1c0-4009-bdc4-483256d99f9d\") " pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:29.839405 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:29.839382 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 15:17:29.839481 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:29.839411 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 15:17:29.839481 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:29.839424 2576 projected.go:194] Error preparing data for projected volume 
kube-api-access-z72mk for pod openshift-network-diagnostics/network-check-target-n2x89: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:29.839589 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:29.839487 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk podName:7400fb35-d1c0-4009-bdc4-483256d99f9d nodeName:}" failed. No retries permitted until 2026-04-17 15:17:33.839466883 +0000 UTC m=+10.295648964 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-z72mk" (UniqueName: "kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk") pod "network-check-target-n2x89" (UID: "7400fb35-d1c0-4009-bdc4-483256d99f9d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:30.136617 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:30.136533 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:30.136772 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:30.136673 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:31.135420 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:31.135384 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:31.135851 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:31.135546 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:32.135284 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:32.135240 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:32.135473 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:32.135378 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:33.135745 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:33.135712 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:33.136310 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:33.135847 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:33.777803 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:33.777581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:33.777803 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:33.777765 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:33.778089 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:33.777834 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs podName:41bb8b03-e874-455f-8416-b76d91f0f117 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:41.77781431 +0000 UTC m=+18.233996384 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs") pod "network-metrics-daemon-82cq8" (UID: "41bb8b03-e874-455f-8416-b76d91f0f117") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:33.878700 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:33.878654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z72mk\" (UniqueName: \"kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk\") pod \"network-check-target-n2x89\" (UID: \"7400fb35-d1c0-4009-bdc4-483256d99f9d\") " pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:33.878883 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:33.878814 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 15:17:33.878883 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:33.878843 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 15:17:33.878883 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:33.878858 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z72mk for pod openshift-network-diagnostics/network-check-target-n2x89: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:33.879068 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:33.878924 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk podName:7400fb35-d1c0-4009-bdc4-483256d99f9d nodeName:}" failed. 
No retries permitted until 2026-04-17 15:17:41.878904692 +0000 UTC m=+18.335086776 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-z72mk" (UniqueName: "kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk") pod "network-check-target-n2x89" (UID: "7400fb35-d1c0-4009-bdc4-483256d99f9d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:34.137103 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:34.136781 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:34.137545 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:34.137152 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:35.135780 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:35.135742 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:35.135958 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:35.135886 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:36.136037 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:36.135962 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:36.136449 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:36.136107 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:37.135179 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:37.135144 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:37.135376 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:37.135281 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:38.135422 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:38.135387 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:38.135975 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:38.135500 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:39.135715 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:39.135680 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:39.136185 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:39.135810 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:40.135879 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:40.135840 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:40.136352 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:40.135971 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:41.135031 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:41.134997 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:41.135202 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:41.135132 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:41.838770 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:41.838730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:41.839217 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:41.838885 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:41.839217 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:41.838950 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs podName:41bb8b03-e874-455f-8416-b76d91f0f117 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:57.838933463 +0000 UTC m=+34.295115539 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs") pod "network-metrics-daemon-82cq8" (UID: "41bb8b03-e874-455f-8416-b76d91f0f117") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 15:17:41.939881 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:41.939835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z72mk\" (UniqueName: \"kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk\") pod \"network-check-target-n2x89\" (UID: \"7400fb35-d1c0-4009-bdc4-483256d99f9d\") " pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:41.940077 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:41.940034 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 15:17:41.940077 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:41.940081 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 15:17:41.940285 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:41.940096 2576 projected.go:194] Error preparing data for projected volume kube-api-access-z72mk for pod openshift-network-diagnostics/network-check-target-n2x89: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:41.940285 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:41.940164 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk podName:7400fb35-d1c0-4009-bdc4-483256d99f9d nodeName:}" failed. 
No retries permitted until 2026-04-17 15:17:57.940144514 +0000 UTC m=+34.396326590 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-z72mk" (UniqueName: "kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk") pod "network-check-target-n2x89" (UID: "7400fb35-d1c0-4009-bdc4-483256d99f9d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 15:17:42.135085 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:42.134988 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:42.135228 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:42.135141 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:43.135491 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:43.135456 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:43.135954 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:43.135591 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:44.136573 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.136399 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:44.137358 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:44.136629 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:44.224421 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.224380 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wqrkq" event={"ID":"a00974bb-abc9-4285-909c-842f9c69b1f3","Type":"ContainerStarted","Data":"d87778acbfd791a5323fd35e75c67ff3dbc39564d10235e6b6a85a49a783120d"} Apr 17 15:17:44.225896 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.225862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-jgfqk" event={"ID":"f4299c8c-3050-4ce9-9766-13f14ff297a7","Type":"ContainerStarted","Data":"ee1aa01cb8e9126e3cd9a64de9432ee1e91f0702b663fadc56ab9c24edeb5dc9"} Apr 17 15:17:44.227291 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.227260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" event={"ID":"fc69e676-8342-4380-a1ba-56fbb970d9d9","Type":"ContainerStarted","Data":"d3899f9918639486931f66369118ec8eb5cb530bb71288117644feec4e97b581"} Apr 17 15:17:44.228661 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.228630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bpw5n" 
event={"ID":"e70e8fd7-f8f2-4303-8371-1696921c6746","Type":"ContainerStarted","Data":"260cfc4243156796e1685ca1f42441127d0fd4e88516ba742ba6680b6aae4735"} Apr 17 15:17:44.230057 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.230011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" event={"ID":"03062b60-45de-4e91-92e7-3959d5322bd1","Type":"ContainerStarted","Data":"6e9d9bac31576112441ee07400c2d58c23712b56652dc32409a934ca98310299"} Apr 17 15:17:44.231275 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.231254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" event={"ID":"1bcb7f53-9455-4609-b5aa-4190a92c8d15","Type":"ContainerStarted","Data":"da70d1dc4a63f8f556cb76d73621be5905a63fe025a7291273df37335ea63188"} Apr 17 15:17:44.233081 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.233043 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:17:44.233404 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.233378 2576 generic.go:358] "Generic (PLEG): container finished" podID="1e3cfe5e-0c86-4d14-ac14-7390274f338b" containerID="1c3d49dcbf4731704d2baea45bd87bb982e17298be4b8c79be7ae1b90f49a491" exitCode=1 Apr 17 15:17:44.233551 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.233438 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerStarted","Data":"7176381a69dca20866fda67c01ab21bb9309a3a25a119b894bdef4f28f25ba4e"} Apr 17 15:17:44.233551 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.233474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" 
event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerDied","Data":"1c3d49dcbf4731704d2baea45bd87bb982e17298be4b8c79be7ae1b90f49a491"} Apr 17 15:17:44.233551 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.233497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerStarted","Data":"3e920fd93fbf8dc186953d94d825cc0719cf903cac700b371f4fd9a1951cfacf"} Apr 17 15:17:44.234910 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.234892 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vndw6" event={"ID":"a89cc04d-c377-4ac2-9120-63ebc1ca2990","Type":"ContainerStarted","Data":"3af3f8d11b6d1757cbc26e52339b37797601acc43a19a21a9131c2410795e91a"} Apr 17 15:17:44.237687 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.237647 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wqrkq" podStartSLOduration=11.150696309 podStartE2EDuration="20.237634976s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:26.781222658 +0000 UTC m=+3.237404731" lastFinishedPulling="2026-04-17 15:17:35.868161311 +0000 UTC m=+12.324343398" observedRunningTime="2026-04-17 15:17:44.237483426 +0000 UTC m=+20.693665523" watchObservedRunningTime="2026-04-17 15:17:44.237634976 +0000 UTC m=+20.693817074" Apr 17 15:17:44.251183 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.251125 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bpw5n" podStartSLOduration=3.439153905 podStartE2EDuration="20.251111171s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:26.750207433 +0000 UTC m=+3.206389510" lastFinishedPulling="2026-04-17 15:17:43.562164697 +0000 UTC m=+20.018346776" observedRunningTime="2026-04-17 15:17:44.250144221 +0000 UTC m=+20.706326324" 
watchObservedRunningTime="2026-04-17 15:17:44.251111171 +0000 UTC m=+20.707293266" Apr 17 15:17:44.263969 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.263916 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-jgfqk" podStartSLOduration=3.476043754 podStartE2EDuration="20.263900019s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:26.774316224 +0000 UTC m=+3.230498297" lastFinishedPulling="2026-04-17 15:17:43.562172484 +0000 UTC m=+20.018354562" observedRunningTime="2026-04-17 15:17:44.263292398 +0000 UTC m=+20.719474494" watchObservedRunningTime="2026-04-17 15:17:44.263900019 +0000 UTC m=+20.720082135" Apr 17 15:17:44.282444 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.282394 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vndw6" podStartSLOduration=3.356711184 podStartE2EDuration="20.282378434s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:26.754142171 +0000 UTC m=+3.210324244" lastFinishedPulling="2026-04-17 15:17:43.679809408 +0000 UTC m=+20.135991494" observedRunningTime="2026-04-17 15:17:44.282170034 +0000 UTC m=+20.738352140" watchObservedRunningTime="2026-04-17 15:17:44.282378434 +0000 UTC m=+20.738560529" Apr 17 15:17:44.297503 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.297460 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-7sgdp" podStartSLOduration=3.504540418 podStartE2EDuration="20.297445086s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:26.777677163 +0000 UTC m=+3.233859236" lastFinishedPulling="2026-04-17 15:17:43.570581831 +0000 UTC m=+20.026763904" observedRunningTime="2026-04-17 15:17:44.296941754 +0000 UTC m=+20.753123849" watchObservedRunningTime="2026-04-17 15:17:44.297445086 +0000 UTC m=+20.753627249" Apr 17 
15:17:44.869719 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:44.869552 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 15:17:45.073904 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.073757 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T15:17:44.86971421Z","UUID":"e75fd592-d4ca-4784-85be-49383abc33d2","Handler":null,"Name":"","Endpoint":""} Apr 17 15:17:45.076409 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.076386 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 15:17:45.076409 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.076412 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 15:17:45.135165 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.135132 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:45.135342 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:45.135240 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:45.238236 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.238202 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc69e676-8342-4380-a1ba-56fbb970d9d9" containerID="d3899f9918639486931f66369118ec8eb5cb530bb71288117644feec4e97b581" exitCode=0 Apr 17 15:17:45.238786 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.238275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" event={"ID":"fc69e676-8342-4380-a1ba-56fbb970d9d9","Type":"ContainerDied","Data":"d3899f9918639486931f66369118ec8eb5cb530bb71288117644feec4e97b581"} Apr 17 15:17:45.239681 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.239661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2vktv" event={"ID":"6e474365-0ff7-4228-b6b7-3f49bc17a45b","Type":"ContainerStarted","Data":"18605ece9491e7418e6f5a0a372974f594240ea41e50245089b095eeb73d0985"} Apr 17 15:17:45.241293 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.241274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" event={"ID":"1bcb7f53-9455-4609-b5aa-4190a92c8d15","Type":"ContainerStarted","Data":"5eb10b4b852f01285c449d1d861977023e6d40e86dd5e9dec04e09f592720f94"} Apr 17 15:17:45.243571 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.243557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:17:45.243963 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.243943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" 
event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerStarted","Data":"f2212c2e0e7ca53d85958c73fb4855b3005efefc7092c70019bb9f5e31204c88"} Apr 17 15:17:45.244031 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.243970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerStarted","Data":"b2207254ae5df0ac4f6e8364989ecda5e3343ad6843023958fe436ac2e8f9e00"} Apr 17 15:17:45.244031 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.243982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerStarted","Data":"1d5cb9b4aa7ee1edca5ab48f1dcac698df3b5c811b1e278a3e03c8b3d9b076aa"} Apr 17 15:17:45.271478 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:45.271434 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2vktv" podStartSLOduration=4.460570204 podStartE2EDuration="21.271422603s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:26.751264424 +0000 UTC m=+3.207446516" lastFinishedPulling="2026-04-17 15:17:43.562116828 +0000 UTC m=+20.018298915" observedRunningTime="2026-04-17 15:17:45.271157347 +0000 UTC m=+21.727339442" watchObservedRunningTime="2026-04-17 15:17:45.271422603 +0000 UTC m=+21.727604698" Apr 17 15:17:46.135710 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:46.135678 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:46.135914 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:46.135800 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:46.246831 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:46.246792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" event={"ID":"1bcb7f53-9455-4609-b5aa-4190a92c8d15","Type":"ContainerStarted","Data":"697d78efa386dedfa65da08fd8bd39ae0d5277dc25b01a2e8645a6bd03762641"} Apr 17 15:17:47.135878 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:47.135847 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:47.136084 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:47.135969 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:47.252032 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:47.252002 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:17:47.252592 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:47.252433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerStarted","Data":"b08c58edd26d7d03ad4b4afead913231e2a4f2511336f80474ea7ad21c5de443"} Apr 17 15:17:47.268122 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:47.268073 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-xmhkz" podStartSLOduration=3.978989939 podStartE2EDuration="23.268043463s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:26.781226374 +0000 UTC m=+3.237408447" lastFinishedPulling="2026-04-17 15:17:46.070279898 +0000 UTC m=+22.526461971" observedRunningTime="2026-04-17 15:17:47.267634459 +0000 UTC m=+23.723816565" watchObservedRunningTime="2026-04-17 15:17:47.268043463 +0000 UTC m=+23.724225558" Apr 17 15:17:48.135383 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:48.135346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:48.135565 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:48.135474 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:48.980960 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:48.980924 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-jgfqk" Apr 17 15:17:48.981931 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:48.981911 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-jgfqk" Apr 17 15:17:49.135433 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:49.135389 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:49.135593 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:49.135529 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:49.256937 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:49.256864 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-jgfqk" Apr 17 15:17:49.257445 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:49.257422 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-jgfqk" Apr 17 15:17:50.135816 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.135572 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:50.136265 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:50.135848 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:50.259755 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.259721 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc69e676-8342-4380-a1ba-56fbb970d9d9" containerID="3a264a30a9eb6a594aa60b72369d7c2f0d98ad401611eae9ab9c11343baaa870" exitCode=0 Apr 17 15:17:50.259908 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.259814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" event={"ID":"fc69e676-8342-4380-a1ba-56fbb970d9d9","Type":"ContainerDied","Data":"3a264a30a9eb6a594aa60b72369d7c2f0d98ad401611eae9ab9c11343baaa870"} Apr 17 15:17:50.262901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.262885 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:17:50.263227 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.263203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerStarted","Data":"23dc55fb69cf719ff1346bbf365dc4048946cf56b2b525e2cb5d3349d492d5ee"} Apr 17 15:17:50.263568 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.263548 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 
15:17:50.263649 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.263574 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:50.263710 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.263694 2576 scope.go:117] "RemoveContainer" containerID="1c3d49dcbf4731704d2baea45bd87bb982e17298be4b8c79be7ae1b90f49a491" Apr 17 15:17:50.278435 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.278410 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:50.278541 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:50.278526 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:51.135788 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:51.135757 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:51.135986 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:51.135902 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:51.268581 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:51.268555 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:17:51.268882 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:51.268859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" event={"ID":"1e3cfe5e-0c86-4d14-ac14-7390274f338b","Type":"ContainerStarted","Data":"53921c37a617b0563509faf4efb8b2950c1b6e5443f9c4996169bca5958263e8"} Apr 17 15:17:51.268990 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:51.268976 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 15:17:51.299724 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:51.299675 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" podStartSLOduration=10.342289914 podStartE2EDuration="27.299657579s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:26.781220981 +0000 UTC m=+3.237403054" lastFinishedPulling="2026-04-17 15:17:43.738588643 +0000 UTC m=+20.194770719" observedRunningTime="2026-04-17 15:17:51.297809625 +0000 UTC m=+27.753991745" watchObservedRunningTime="2026-04-17 15:17:51.299657579 +0000 UTC m=+27.755839674" Apr 17 15:17:51.573192 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:51.572335 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n2x89"] Apr 17 15:17:51.573192 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:51.572842 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:51.573192 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:51.572946 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:51.573192 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:51.573005 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-82cq8"] Apr 17 15:17:51.573192 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:51.573168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:51.573553 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:51.573274 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:52.274514 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:52.274480 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc69e676-8342-4380-a1ba-56fbb970d9d9" containerID="e49b305fd47562bf3f365a67396244c7035702e4b65c3c9abeb6f78251aeb2b0" exitCode=0 Apr 17 15:17:52.274964 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:52.274568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" event={"ID":"fc69e676-8342-4380-a1ba-56fbb970d9d9","Type":"ContainerDied","Data":"e49b305fd47562bf3f365a67396244c7035702e4b65c3c9abeb6f78251aeb2b0"} Apr 17 15:17:52.274964 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:52.274788 2576 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 15:17:52.797771 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:52.797711 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns" Apr 17 15:17:53.135555 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:53.135483 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:53.135555 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:53.135507 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:53.135727 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:53.135597 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:53.135773 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:53.135749 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:54.279941 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:54.279903 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc69e676-8342-4380-a1ba-56fbb970d9d9" containerID="36ded8e59c4e57cf443eee34f8291b4f3a30322def6982896e8be90e331aea71" exitCode=0 Apr 17 15:17:54.279941 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:54.279946 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" event={"ID":"fc69e676-8342-4380-a1ba-56fbb970d9d9","Type":"ContainerDied","Data":"36ded8e59c4e57cf443eee34f8291b4f3a30322def6982896e8be90e331aea71"} Apr 17 15:17:55.134961 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:55.134933 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:55.135165 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:55.134935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:55.135165 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:55.135079 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-n2x89" podUID="7400fb35-d1c0-4009-bdc4-483256d99f9d" Apr 17 15:17:55.135272 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:55.135171 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117" Apr 17 15:17:56.905511 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.905474 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeReady" Apr 17 15:17:56.905976 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.905624 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 15:17:56.947073 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.947026 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dw8kz"] Apr 17 15:17:56.965858 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.965826 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4g86w"] Apr 17 15:17:56.966077 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.966039 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:56.969532 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.969349 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 15:17:56.969648 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.969385 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p8skx\"" Apr 17 15:17:56.971830 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.970939 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 15:17:56.983406 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.983380 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dw8kz"] Apr 17 15:17:56.983525 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.983416 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4g86w"] Apr 17 15:17:56.983525 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.983521 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:17:56.986122 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.986099 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 15:17:56.986669 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.986532 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 15:17:56.986669 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.986577 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 15:17:56.986669 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:56.986582 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wwl79\"" Apr 17 15:17:57.059845 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.059811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.059845 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.059854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpp24\" (UniqueName: \"kubernetes.io/projected/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-kube-api-access-rpp24\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.060112 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.059929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-tmp-dir\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.060112 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.059971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-config-volume\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.135228 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.135186 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:57.135436 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.135190 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:57.137946 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.137922 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 15:17:57.138107 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.137929 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 15:17:57.138107 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.137932 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9m884\"" Apr 17 15:17:57.138107 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.138035 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sx2cz\"" Apr 17 15:17:57.138278 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.138262 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 15:17:57.161069 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.160996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-tmp-dir\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.161069 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.161034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-config-volume\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.161248 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.161086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:17:57.161248 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.161140 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8szh\" (UniqueName: \"kubernetes.io/projected/b80c4e77-d795-4111-a247-f612ad85f926-kube-api-access-r8szh\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:17:57.161248 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.161179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod 
\"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.161372 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.161255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpp24\" (UniqueName: \"kubernetes.io/projected/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-kube-api-access-rpp24\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.161372 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.161268 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:17:57.161372 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.161327 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls podName:37750dd6-91d9-43c3-a3ac-75a2c4ce6eec nodeName:}" failed. No retries permitted until 2026-04-17 15:17:57.661308464 +0000 UTC m=+34.117490558 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls") pod "dns-default-dw8kz" (UID: "37750dd6-91d9-43c3-a3ac-75a2c4ce6eec") : secret "dns-default-metrics-tls" not found Apr 17 15:17:57.161527 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.161373 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-tmp-dir\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.161843 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.161790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-config-volume\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.171607 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.171579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpp24\" (UniqueName: \"kubernetes.io/projected/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-kube-api-access-rpp24\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.261618 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.261561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:17:57.261798 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.261630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8szh\" (UniqueName: 
\"kubernetes.io/projected/b80c4e77-d795-4111-a247-f612ad85f926-kube-api-access-r8szh\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:17:57.261798 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.261731 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:17:57.261879 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.261817 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert podName:b80c4e77-d795-4111-a247-f612ad85f926 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:57.761797043 +0000 UTC m=+34.217979120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert") pod "ingress-canary-4g86w" (UID: "b80c4e77-d795-4111-a247-f612ad85f926") : secret "canary-serving-cert" not found Apr 17 15:17:57.272430 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.272402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8szh\" (UniqueName: \"kubernetes.io/projected/b80c4e77-d795-4111-a247-f612ad85f926-kube-api-access-r8szh\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:17:57.664728 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.664689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:57.664934 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.664850 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:17:57.664934 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.664932 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls podName:37750dd6-91d9-43c3-a3ac-75a2c4ce6eec nodeName:}" failed. No retries permitted until 2026-04-17 15:17:58.664915675 +0000 UTC m=+35.121097748 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls") pod "dns-default-dw8kz" (UID: "37750dd6-91d9-43c3-a3ac-75a2c4ce6eec") : secret "dns-default-metrics-tls" not found Apr 17 15:17:57.765735 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.765695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:17:57.765901 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.765817 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:17:57.765901 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.765881 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert podName:b80c4e77-d795-4111-a247-f612ad85f926 nodeName:}" failed. No retries permitted until 2026-04-17 15:17:58.76586634 +0000 UTC m=+35.222048412 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert") pod "ingress-canary-4g86w" (UID: "b80c4e77-d795-4111-a247-f612ad85f926") : secret "canary-serving-cert" not found Apr 17 15:17:57.866766 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.866731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8" Apr 17 15:17:57.866940 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.866849 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 15:17:57.866940 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:57.866907 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs podName:41bb8b03-e874-455f-8416-b76d91f0f117 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:29.866891776 +0000 UTC m=+66.323073851 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs") pod "network-metrics-daemon-82cq8" (UID: "41bb8b03-e874-455f-8416-b76d91f0f117") : secret "metrics-daemon-secret" not found Apr 17 15:17:57.967836 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.967799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z72mk\" (UniqueName: \"kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk\") pod \"network-check-target-n2x89\" (UID: \"7400fb35-d1c0-4009-bdc4-483256d99f9d\") " pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:57.970798 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:57.970768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z72mk\" (UniqueName: \"kubernetes.io/projected/7400fb35-d1c0-4009-bdc4-483256d99f9d-kube-api-access-z72mk\") pod \"network-check-target-n2x89\" (UID: \"7400fb35-d1c0-4009-bdc4-483256d99f9d\") " pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:58.045643 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:58.045607 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:17:58.207861 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:58.207826 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-n2x89"] Apr 17 15:17:58.212328 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:17:58.212283 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7400fb35_d1c0_4009_bdc4_483256d99f9d.slice/crio-7a5acd81989b23c6a382b00510e990f43520c869d3e8a6921a0ab03a690c8f3e WatchSource:0}: Error finding container 7a5acd81989b23c6a382b00510e990f43520c869d3e8a6921a0ab03a690c8f3e: Status 404 returned error can't find the container with id 7a5acd81989b23c6a382b00510e990f43520c869d3e8a6921a0ab03a690c8f3e Apr 17 15:17:58.288728 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:58.288646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n2x89" event={"ID":"7400fb35-d1c0-4009-bdc4-483256d99f9d","Type":"ContainerStarted","Data":"7a5acd81989b23c6a382b00510e990f43520c869d3e8a6921a0ab03a690c8f3e"} Apr 17 15:17:58.673753 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:58.673676 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:17:58.673985 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:58.673827 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:17:58.673985 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:58.673892 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls 
podName:37750dd6-91d9-43c3-a3ac-75a2c4ce6eec nodeName:}" failed. No retries permitted until 2026-04-17 15:18:00.673875525 +0000 UTC m=+37.130057598 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls") pod "dns-default-dw8kz" (UID: "37750dd6-91d9-43c3-a3ac-75a2c4ce6eec") : secret "dns-default-metrics-tls" not found Apr 17 15:17:58.774178 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:17:58.774141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:17:58.774352 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:58.774301 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:17:58.774396 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:17:58.774382 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert podName:b80c4e77-d795-4111-a247-f612ad85f926 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:00.77435994 +0000 UTC m=+37.230542033 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert") pod "ingress-canary-4g86w" (UID: "b80c4e77-d795-4111-a247-f612ad85f926") : secret "canary-serving-cert" not found Apr 17 15:18:00.692398 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:00.692357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:18:00.692858 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:00.692544 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:18:00.692858 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:00.692629 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls podName:37750dd6-91d9-43c3-a3ac-75a2c4ce6eec nodeName:}" failed. No retries permitted until 2026-04-17 15:18:04.692608127 +0000 UTC m=+41.148790204 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls") pod "dns-default-dw8kz" (UID: "37750dd6-91d9-43c3-a3ac-75a2c4ce6eec") : secret "dns-default-metrics-tls" not found Apr 17 15:18:00.793028 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:00.792990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:18:00.793236 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:00.793139 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:18:00.793236 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:00.793218 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert podName:b80c4e77-d795-4111-a247-f612ad85f926 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:04.793196983 +0000 UTC m=+41.249379068 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert") pod "ingress-canary-4g86w" (UID: "b80c4e77-d795-4111-a247-f612ad85f926") : secret "canary-serving-cert" not found Apr 17 15:18:03.300792 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:03.300547 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc69e676-8342-4380-a1ba-56fbb970d9d9" containerID="7a4fda3654086360fc91ab12f874b1a2d664734840cce16221341e2c33bb95c9" exitCode=0 Apr 17 15:18:03.300792 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:03.300626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" event={"ID":"fc69e676-8342-4380-a1ba-56fbb970d9d9","Type":"ContainerDied","Data":"7a4fda3654086360fc91ab12f874b1a2d664734840cce16221341e2c33bb95c9"} Apr 17 15:18:03.302122 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:03.302101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-n2x89" event={"ID":"7400fb35-d1c0-4009-bdc4-483256d99f9d","Type":"ContainerStarted","Data":"e37abf9333563b82cfaefd59e27fad6a31ac30363cf45a9ac6f4dc29f9fc58be"} Apr 17 15:18:03.302242 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:03.302211 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-n2x89" Apr 17 15:18:03.336957 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:03.336903 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-n2x89" podStartSLOduration=34.796296639 podStartE2EDuration="39.336882648s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:58.214600489 +0000 UTC m=+34.670782576" lastFinishedPulling="2026-04-17 15:18:02.755186511 +0000 UTC m=+39.211368585" observedRunningTime="2026-04-17 15:18:03.336343926 +0000 UTC 
m=+39.792526021" watchObservedRunningTime="2026-04-17 15:18:03.336882648 +0000 UTC m=+39.793064743" Apr 17 15:18:04.307639 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:04.306788 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc69e676-8342-4380-a1ba-56fbb970d9d9" containerID="7072e9afce02c692840a40d6185574e58e4a17188c1eeb1339373445f516376c" exitCode=0 Apr 17 15:18:04.307639 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:04.307174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" event={"ID":"fc69e676-8342-4380-a1ba-56fbb970d9d9","Type":"ContainerDied","Data":"7072e9afce02c692840a40d6185574e58e4a17188c1eeb1339373445f516376c"} Apr 17 15:18:04.721802 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:04.721767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:18:04.721947 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:04.721919 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:18:04.722004 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:04.721990 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls podName:37750dd6-91d9-43c3-a3ac-75a2c4ce6eec nodeName:}" failed. No retries permitted until 2026-04-17 15:18:12.721973602 +0000 UTC m=+49.178155674 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls") pod "dns-default-dw8kz" (UID: "37750dd6-91d9-43c3-a3ac-75a2c4ce6eec") : secret "dns-default-metrics-tls" not found Apr 17 15:18:04.822817 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:04.822781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:18:04.822974 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:04.822916 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:18:04.823017 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:04.822978 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert podName:b80c4e77-d795-4111-a247-f612ad85f926 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:12.822961862 +0000 UTC m=+49.279143935 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert") pod "ingress-canary-4g86w" (UID: "b80c4e77-d795-4111-a247-f612ad85f926") : secret "canary-serving-cert" not found Apr 17 15:18:05.311025 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:05.310982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" event={"ID":"fc69e676-8342-4380-a1ba-56fbb970d9d9","Type":"ContainerStarted","Data":"0f30099957a14e91ceb67b37bbb1863d1ef8098d45ef529d29b5b5948eb4b8af"} Apr 17 15:18:05.333415 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:05.333339 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ffhn2" podStartSLOduration=5.320003862 podStartE2EDuration="41.333322801s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:17:26.753262119 +0000 UTC m=+3.209444198" lastFinishedPulling="2026-04-17 15:18:02.766581062 +0000 UTC m=+39.222763137" observedRunningTime="2026-04-17 15:18:05.331891298 +0000 UTC m=+41.788073390" watchObservedRunningTime="2026-04-17 15:18:05.333322801 +0000 UTC m=+41.789504898" Apr 17 15:18:12.768922 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:12.768880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz" Apr 17 15:18:12.769527 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:12.769041 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 15:18:12.769527 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:12.769158 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls podName:37750dd6-91d9-43c3-a3ac-75a2c4ce6eec nodeName:}" failed. No retries permitted until 2026-04-17 15:18:28.769135408 +0000 UTC m=+65.225317482 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls") pod "dns-default-dw8kz" (UID: "37750dd6-91d9-43c3-a3ac-75a2c4ce6eec") : secret "dns-default-metrics-tls" not found Apr 17 15:18:12.869338 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:12.869294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w" Apr 17 15:18:12.869503 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:12.869404 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 15:18:12.869503 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:12.869462 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert podName:b80c4e77-d795-4111-a247-f612ad85f926 nodeName:}" failed. No retries permitted until 2026-04-17 15:18:28.869448453 +0000 UTC m=+65.325630526 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert") pod "ingress-canary-4g86w" (UID: "b80c4e77-d795-4111-a247-f612ad85f926") : secret "canary-serving-cert" not found Apr 17 15:18:15.161404 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.161371 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c"] Apr 17 15:18:15.198316 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.198283 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c"] Apr 17 15:18:15.198316 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.198312 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"] Apr 17 15:18:15.198496 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.198425 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c" Apr 17 15:18:15.200924 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.200897 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 15:18:15.201908 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.201458 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 15:18:15.201908 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.201831 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-m5g7g\"" Apr 17 15:18:15.202695 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.202675 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 15:18:15.202938 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.202923 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 15:18:15.215911 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.215890 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"] Apr 17 15:18:15.216003 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.215992 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.218385 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.218362 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 15:18:15.218385 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.218377 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 15:18:15.218531 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.218362 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 15:18:15.218531 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.218426 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 15:18:15.283891 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.283852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c\" (UID: \"ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c" Apr 17 15:18:15.284103 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.283911 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-ca\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.284103 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.283941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.284103 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.283958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5wq2\" (UniqueName: \"kubernetes.io/projected/703b21b6-219e-47d9-859a-28871635be3d-kube-api-access-v5wq2\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.284103 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.283983 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-hub\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.284103 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.284012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/703b21b6-219e-47d9-859a-28871635be3d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.284103 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.284035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4vv9\" (UniqueName: \"kubernetes.io/projected/ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4-kube-api-access-j4vv9\") pod \"managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c\" (UID: \"ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c" Apr 17 15:18:15.284299 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.284108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.385227 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.385190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.385227 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.385223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5wq2\" (UniqueName: \"kubernetes.io/projected/703b21b6-219e-47d9-859a-28871635be3d-kube-api-access-v5wq2\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.385436 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.385252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-hub\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.385482 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.385454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/703b21b6-219e-47d9-859a-28871635be3d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.385522 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.385508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4vv9\" (UniqueName: \"kubernetes.io/projected/ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4-kube-api-access-j4vv9\") pod \"managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c\" (UID: \"ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c" Apr 17 15:18:15.385583 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.385565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:18:15.385640 ip-10-0-133-75 kubenswrapper[2576]: I0417 
15:18:15.385621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c\" (UID: \"ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c"
Apr 17 15:18:15.385697 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.385678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-ca\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"
Apr 17 15:18:15.386252 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.386208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/703b21b6-219e-47d9-859a-28871635be3d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"
Apr 17 15:18:15.389065 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.389027 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-hub\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"
Apr 17 15:18:15.389225 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.389207 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName:
\"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"
Apr 17 15:18:15.389453 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.389431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c\" (UID: \"ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c"
Apr 17 15:18:15.389511 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.389439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"
Apr 17 15:18:15.392294 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.392274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4vv9\" (UniqueName: \"kubernetes.io/projected/ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4-kube-api-access-j4vv9\") pod \"managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c\" (UID: \"ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c"
Apr 17 15:18:15.392411 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.392355 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5wq2\" (UniqueName: \"kubernetes.io/projected/703b21b6-219e-47d9-859a-28871635be3d-kube-api-access-v5wq2\") pod
\"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"
Apr 17 15:18:15.400515 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.400492 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/703b21b6-219e-47d9-859a-28871635be3d-ca\") pod \"cluster-proxy-proxy-agent-66ff9f64b6-qbbfz\" (UID: \"703b21b6-219e-47d9-859a-28871635be3d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"
Apr 17 15:18:15.516693 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.516659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c"
Apr 17 15:18:15.523504 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.523402 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"
Apr 17 15:18:15.636370 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.636330 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c"]
Apr 17 15:18:15.640088 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:18:15.640033 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee94ec78_88d2_4eb6_a3e8_1c8cb6470ad4.slice/crio-84b0531769668a0e3fc48452b231a47f32392bf77c36903acb14cd3162f22d54 WatchSource:0}: Error finding container 84b0531769668a0e3fc48452b231a47f32392bf77c36903acb14cd3162f22d54: Status 404 returned error can't find the container with id 84b0531769668a0e3fc48452b231a47f32392bf77c36903acb14cd3162f22d54
Apr 17 15:18:15.654430 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:15.654407 2576 kubelet.go:2544]
"SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz"]
Apr 17 15:18:15.657313 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:18:15.657287 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod703b21b6_219e_47d9_859a_28871635be3d.slice/crio-208d5945f396efa932fa9e57ec5a34282356bd27ddb8b688b0d8db158dfaef6f WatchSource:0}: Error finding container 208d5945f396efa932fa9e57ec5a34282356bd27ddb8b688b0d8db158dfaef6f: Status 404 returned error can't find the container with id 208d5945f396efa932fa9e57ec5a34282356bd27ddb8b688b0d8db158dfaef6f
Apr 17 15:18:16.332721 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:16.332684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" event={"ID":"703b21b6-219e-47d9-859a-28871635be3d","Type":"ContainerStarted","Data":"208d5945f396efa932fa9e57ec5a34282356bd27ddb8b688b0d8db158dfaef6f"}
Apr 17 15:18:16.333583 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:16.333563 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c" event={"ID":"ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4","Type":"ContainerStarted","Data":"84b0531769668a0e3fc48452b231a47f32392bf77c36903acb14cd3162f22d54"}
Apr 17 15:18:20.344205 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:20.344171 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" event={"ID":"703b21b6-219e-47d9-859a-28871635be3d","Type":"ContainerStarted","Data":"3b24ff5cff49eb7b1282c46bdcaa2739e39a639401aaebf600da0af6e946a541"}
Apr 17 15:18:20.345411 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:20.345387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c" event={"ID":"ee94ec78-88d2-4eb6-a3e8-1c8cb6470ad4","Type":"ContainerStarted","Data":"dbc80efd3cc798491033262458e34fb31a2f008a4e281fa542e265c712a8d743"}
Apr 17 15:18:20.358813 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:20.358697 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75fc7f7cf7-v5j6c" podStartSLOduration=1.231518368 podStartE2EDuration="5.358681283s" podCreationTimestamp="2026-04-17 15:18:15 +0000 UTC" firstStartedPulling="2026-04-17 15:18:15.641719718 +0000 UTC m=+52.097901791" lastFinishedPulling="2026-04-17 15:18:19.768882633 +0000 UTC m=+56.225064706" observedRunningTime="2026-04-17 15:18:20.358276488 +0000 UTC m=+56.814458583" watchObservedRunningTime="2026-04-17 15:18:20.358681283 +0000 UTC m=+56.814863380"
Apr 17 15:18:23.295208 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:23.295180 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7p8ns"
Apr 17 15:18:25.359828 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:25.359786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" event={"ID":"703b21b6-219e-47d9-859a-28871635be3d","Type":"ContainerStarted","Data":"1e3ce9491a8f3280b063d4231341462d1ba0c9600056dac8dffe43ce4e93eba3"}
Apr 17 15:18:25.360208 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:25.359836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" event={"ID":"703b21b6-219e-47d9-859a-28871635be3d","Type":"ContainerStarted","Data":"8aa67c9c1a5c4ce36dfeab37ae5dff857ef05c5f6c722a1b6fbdd59260a5f805"}
Apr 17 15:18:25.376106 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:25.376041 2576
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" podStartSLOduration=1.5856807179999999 podStartE2EDuration="10.376027822s" podCreationTimestamp="2026-04-17 15:18:15 +0000 UTC" firstStartedPulling="2026-04-17 15:18:15.658977671 +0000 UTC m=+52.115159744" lastFinishedPulling="2026-04-17 15:18:24.449324757 +0000 UTC m=+60.905506848" observedRunningTime="2026-04-17 15:18:25.375156584 +0000 UTC m=+61.831338684" watchObservedRunningTime="2026-04-17 15:18:25.376027822 +0000 UTC m=+61.832209969"
Apr 17 15:18:28.785972 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:28.785928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz"
Apr 17 15:18:28.786427 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:28.786124 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:18:28.786427 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:28.786193 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls podName:37750dd6-91d9-43c3-a3ac-75a2c4ce6eec nodeName:}" failed. No retries permitted until 2026-04-17 15:19:00.786176509 +0000 UTC m=+97.242358582 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls") pod "dns-default-dw8kz" (UID: "37750dd6-91d9-43c3-a3ac-75a2c4ce6eec") : secret "dns-default-metrics-tls" not found
Apr 17 15:18:28.886405 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:28.886371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w"
Apr 17 15:18:28.886560 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:28.886510 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:18:28.886597 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:28.886586 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert podName:b80c4e77-d795-4111-a247-f612ad85f926 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:00.886571217 +0000 UTC m=+97.342753290 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert") pod "ingress-canary-4g86w" (UID: "b80c4e77-d795-4111-a247-f612ad85f926") : secret "canary-serving-cert" not found
Apr 17 15:18:29.893848 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:29.893752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:18:29.893848 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:29.893847 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 15:18:29.894311 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:18:29.893917 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs podName:41bb8b03-e874-455f-8416-b76d91f0f117 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:33.893897957 +0000 UTC m=+130.350080047 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs") pod "network-metrics-daemon-82cq8" (UID: "41bb8b03-e874-455f-8416-b76d91f0f117") : secret "metrics-daemon-secret" not found
Apr 17 15:18:34.309670 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:18:34.309638 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-n2x89"
Apr 17 15:19:00.811261 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:00.811113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz"
Apr 17 15:19:00.811702 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:00.811274 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 15:19:00.811702 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:00.811368 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls podName:37750dd6-91d9-43c3-a3ac-75a2c4ce6eec nodeName:}" failed. No retries permitted until 2026-04-17 15:20:04.8113483 +0000 UTC m=+161.267530373 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls") pod "dns-default-dw8kz" (UID: "37750dd6-91d9-43c3-a3ac-75a2c4ce6eec") : secret "dns-default-metrics-tls" not found
Apr 17 15:19:00.912425 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:00.912371 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w"
Apr 17 15:19:00.912599 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:00.912515 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 15:19:00.912599 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:00.912583 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert podName:b80c4e77-d795-4111-a247-f612ad85f926 nodeName:}" failed. No retries permitted until 2026-04-17 15:20:04.91256918 +0000 UTC m=+161.368751257 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert") pod "ingress-canary-4g86w" (UID: "b80c4e77-d795-4111-a247-f612ad85f926") : secret "canary-serving-cert" not found
Apr 17 15:19:14.568453 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:14.568421 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bpw5n_e70e8fd7-f8f2-4303-8371-1696921c6746/dns-node-resolver/0.log"
Apr 17 15:19:15.768607 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:15.768578 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wqrkq_a00974bb-abc9-4285-909c-842f9c69b1f3/node-ca/0.log"
Apr 17 15:19:33.953578 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:33.953534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:19:33.953981 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:33.953697 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 15:19:33.953981 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:33.953777 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs podName:41bb8b03-e874-455f-8416-b76d91f0f117 nodeName:}" failed. No retries permitted until 2026-04-17 15:21:35.95375715 +0000 UTC m=+252.409939225 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs") pod "network-metrics-daemon-82cq8" (UID: "41bb8b03-e874-455f-8416-b76d91f0f117") : secret "metrics-daemon-secret" not found
Apr 17 15:19:35.524540 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:35.524481 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" podUID="703b21b6-219e-47d9-859a-28871635be3d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 15:19:45.524468 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:45.524429 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" podUID="703b21b6-219e-47d9-859a-28871635be3d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 17 15:19:46.628637 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.628598 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xs229"]
Apr 17 15:19:46.631502 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.631485 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.634824 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.634801 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 15:19:46.634927 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.634860 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ctp72\""
Apr 17 15:19:46.634927 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.634861 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 15:19:46.635072 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.635040 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 15:19:46.635144 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.635108 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 15:19:46.642276 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.642252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xs229"]
Apr 17 15:19:46.743228 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.743196 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2fbe12c0-d02c-498e-96a3-2d9911087940-crio-socket\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.743409 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.743245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\"
(UniqueName: \"kubernetes.io/empty-dir/2fbe12c0-d02c-498e-96a3-2d9911087940-data-volume\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.743409 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.743299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd2zw\" (UniqueName: \"kubernetes.io/projected/2fbe12c0-d02c-498e-96a3-2d9911087940-kube-api-access-dd2zw\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.743409 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.743396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2fbe12c0-d02c-498e-96a3-2d9911087940-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.743566 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.743458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2fbe12c0-d02c-498e-96a3-2d9911087940-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.844577 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.844537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2fbe12c0-d02c-498e-96a3-2d9911087940-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") "
pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.844737 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.844599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2fbe12c0-d02c-498e-96a3-2d9911087940-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.844737 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.844633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2fbe12c0-d02c-498e-96a3-2d9911087940-crio-socket\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.844737 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.844676 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fbe12c0-d02c-498e-96a3-2d9911087940-data-volume\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.844889 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.844751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2fbe12c0-d02c-498e-96a3-2d9911087940-crio-socket\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.844889 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.844795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd2zw\" (UniqueName:
\"kubernetes.io/projected/2fbe12c0-d02c-498e-96a3-2d9911087940-kube-api-access-dd2zw\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.844993 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.844977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2fbe12c0-d02c-498e-96a3-2d9911087940-data-volume\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.845149 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.845131 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2fbe12c0-d02c-498e-96a3-2d9911087940-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.847089 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.847064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2fbe12c0-d02c-498e-96a3-2d9911087940-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.857996 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:46.857966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd2zw\" (UniqueName: \"kubernetes.io/projected/2fbe12c0-d02c-498e-96a3-2d9911087940-kube-api-access-dd2zw\") pod \"insights-runtime-extractor-xs229\" (UID: \"2fbe12c0-d02c-498e-96a3-2d9911087940\") " pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:46.940310 ip-10-0-133-75
kubenswrapper[2576]: I0417 15:19:46.940215 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xs229"
Apr 17 15:19:47.056754 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:47.056722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xs229"]
Apr 17 15:19:47.059725 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:19:47.059693 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fbe12c0_d02c_498e_96a3_2d9911087940.slice/crio-e6b6239fd2101dffd78eb77c12b05927fced7467285b603063827cfc32df6123 WatchSource:0}: Error finding container e6b6239fd2101dffd78eb77c12b05927fced7467285b603063827cfc32df6123: Status 404 returned error can't find the container with id e6b6239fd2101dffd78eb77c12b05927fced7467285b603063827cfc32df6123
Apr 17 15:19:47.546851 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:47.546818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs229" event={"ID":"2fbe12c0-d02c-498e-96a3-2d9911087940","Type":"ContainerStarted","Data":"8516bdba2167cd24d2fed2b0a5d69eef6b1592d63b15167491d2390cc5571ffa"}
Apr 17 15:19:47.546851 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:47.546853 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs229" event={"ID":"2fbe12c0-d02c-498e-96a3-2d9911087940","Type":"ContainerStarted","Data":"e6b6239fd2101dffd78eb77c12b05927fced7467285b603063827cfc32df6123"}
Apr 17 15:19:48.551424 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:48.551390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs229" event={"ID":"2fbe12c0-d02c-498e-96a3-2d9911087940","Type":"ContainerStarted","Data":"cb4b1f1c71e4ebd2377004cf34f9b7cbc64740615c1dbc38835ceca21672cef7"}
Apr 17 15:19:49.554962
ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:49.554928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xs229" event={"ID":"2fbe12c0-d02c-498e-96a3-2d9911087940","Type":"ContainerStarted","Data":"1e0e75695f0a757eb39cc22d5f9d69c7d0f149393f7f75ab0085e9daace05472"}
Apr 17 15:19:49.573450 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:49.573401 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xs229" podStartSLOduration=1.551556669 podStartE2EDuration="3.573387116s" podCreationTimestamp="2026-04-17 15:19:46 +0000 UTC" firstStartedPulling="2026-04-17 15:19:47.127550425 +0000 UTC m=+143.583732501" lastFinishedPulling="2026-04-17 15:19:49.149380875 +0000 UTC m=+145.605562948" observedRunningTime="2026-04-17 15:19:49.572705024 +0000 UTC m=+146.028887120" watchObservedRunningTime="2026-04-17 15:19:49.573387116 +0000 UTC m=+146.029569210"
Apr 17 15:19:50.190947 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.190914 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-w4mf6"]
Apr 17 15:19:50.193843 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.193820 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6"
Apr 17 15:19:50.197267 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.197246 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 15:19:50.197613 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.197596 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-cf5fh\""
Apr 17 15:19:50.197712 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.197615 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 15:19:50.197712 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.197629 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 15:19:50.197712 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.197687 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 15:19:50.197867 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.197687 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 15:19:50.202779 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.202754 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-w4mf6"]
Apr 17 15:19:50.374855 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.374807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a5e95c4-7657-40ac-9a51-cd8077d947ec-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") "
pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.374855 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.374856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9pwr\" (UniqueName: \"kubernetes.io/projected/3a5e95c4-7657-40ac-9a51-cd8077d947ec-kube-api-access-w9pwr\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.375166 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.374958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a5e95c4-7657-40ac-9a51-cd8077d947ec-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.375166 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.375080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5e95c4-7657-40ac-9a51-cd8077d947ec-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.475648 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.475603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a5e95c4-7657-40ac-9a51-cd8077d947ec-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.475648 ip-10-0-133-75 
kubenswrapper[2576]: I0417 15:19:50.475653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9pwr\" (UniqueName: \"kubernetes.io/projected/3a5e95c4-7657-40ac-9a51-cd8077d947ec-kube-api-access-w9pwr\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.475974 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.475705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3a5e95c4-7657-40ac-9a51-cd8077d947ec-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.475974 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.475740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5e95c4-7657-40ac-9a51-cd8077d947ec-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.476463 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.476442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a5e95c4-7657-40ac-9a51-cd8077d947ec-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.478182 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.478158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/3a5e95c4-7657-40ac-9a51-cd8077d947ec-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.478270 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.478161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a5e95c4-7657-40ac-9a51-cd8077d947ec-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.483247 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.483226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9pwr\" (UniqueName: \"kubernetes.io/projected/3a5e95c4-7657-40ac-9a51-cd8077d947ec-kube-api-access-w9pwr\") pod \"prometheus-operator-5676c8c784-w4mf6\" (UID: \"3a5e95c4-7657-40ac-9a51-cd8077d947ec\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.503190 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.503151 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" Apr 17 15:19:50.619339 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:50.619303 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-w4mf6"] Apr 17 15:19:50.622974 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:19:50.622944 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a5e95c4_7657_40ac_9a51_cd8077d947ec.slice/crio-0c7c3b5da5418738a4760e467f95f7230712fbe6b29949e372064e3e21bf4a6a WatchSource:0}: Error finding container 0c7c3b5da5418738a4760e467f95f7230712fbe6b29949e372064e3e21bf4a6a: Status 404 returned error can't find the container with id 0c7c3b5da5418738a4760e467f95f7230712fbe6b29949e372064e3e21bf4a6a Apr 17 15:19:51.566499 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:51.566459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" event={"ID":"3a5e95c4-7657-40ac-9a51-cd8077d947ec","Type":"ContainerStarted","Data":"0c7c3b5da5418738a4760e467f95f7230712fbe6b29949e372064e3e21bf4a6a"} Apr 17 15:19:52.569970 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:52.569935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" event={"ID":"3a5e95c4-7657-40ac-9a51-cd8077d947ec","Type":"ContainerStarted","Data":"4912698c0e203fa980ceeca4371680d5e25a84487685ae330fd52b0b3f627acb"} Apr 17 15:19:52.569970 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:52.569972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" event={"ID":"3a5e95c4-7657-40ac-9a51-cd8077d947ec","Type":"ContainerStarted","Data":"9989835ed010884e6bf42f250575e21e31545879c7ed9afd6d8c12365077b6f5"} Apr 17 15:19:52.585763 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:52.585709 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-w4mf6" podStartSLOduration=1.164809332 podStartE2EDuration="2.585691956s" podCreationTimestamp="2026-04-17 15:19:50 +0000 UTC" firstStartedPulling="2026-04-17 15:19:50.624926328 +0000 UTC m=+147.081108404" lastFinishedPulling="2026-04-17 15:19:52.045808951 +0000 UTC m=+148.501991028" observedRunningTime="2026-04-17 15:19:52.584745356 +0000 UTC m=+149.040927464" watchObservedRunningTime="2026-04-17 15:19:52.585691956 +0000 UTC m=+149.041874052" Apr 17 15:19:54.549209 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.549174 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dxplq"] Apr 17 15:19:54.552286 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.552262 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.554479 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.554461 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 15:19:54.554584 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.554541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 15:19:54.554584 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.554555 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-95j6v\"" Apr 17 15:19:54.554584 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.554573 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 15:19:54.710531 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.710497 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-metrics-client-ca\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.710700 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.710547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-accelerators-collector-config\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.710700 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.710639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-sys\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.710700 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.710691 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-root\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.710813 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.710726 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " 
pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.710813 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.710744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpw8s\" (UniqueName: \"kubernetes.io/projected/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-kube-api-access-rpw8s\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.710813 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.710797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-textfile\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.710933 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.710834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-tls\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.710933 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.710850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-wtmp\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811664 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-tls\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811664 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-wtmp\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811664 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-metrics-client-ca\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811905 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-accelerators-collector-config\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811905 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-sys\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811905 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811769 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-root\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811905 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-wtmp\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811905 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811905 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-root\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811905 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpw8s\" (UniqueName: \"kubernetes.io/projected/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-kube-api-access-rpw8s\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.811905 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811896 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-textfile\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.812310 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.811833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-sys\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.812310 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.812256 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-textfile\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.812416 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.812396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-metrics-client-ca\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.812529 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.812513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-accelerators-collector-config\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.813998 ip-10-0-133-75 kubenswrapper[2576]: I0417 
15:19:54.813979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-tls\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.814057 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.814034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.819605 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.819568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpw8s\" (UniqueName: \"kubernetes.io/projected/40471bd1-f59d-42eb-84db-1b5f51c4f3e8-kube-api-access-rpw8s\") pod \"node-exporter-dxplq\" (UID: \"40471bd1-f59d-42eb-84db-1b5f51c4f3e8\") " pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.861558 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:54.861523 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dxplq" Apr 17 15:19:54.869146 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:19:54.869114 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40471bd1_f59d_42eb_84db_1b5f51c4f3e8.slice/crio-856e367c5849fa94a9c27f68d0c60eaf1ccac706f2c5c3399cf04318fd9127e6 WatchSource:0}: Error finding container 856e367c5849fa94a9c27f68d0c60eaf1ccac706f2c5c3399cf04318fd9127e6: Status 404 returned error can't find the container with id 856e367c5849fa94a9c27f68d0c60eaf1ccac706f2c5c3399cf04318fd9127e6 Apr 17 15:19:55.524923 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:55.524812 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" podUID="703b21b6-219e-47d9-859a-28871635be3d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 15:19:55.524923 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:55.524894 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" Apr 17 15:19:55.525498 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:55.525460 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"1e3ce9491a8f3280b063d4231341462d1ba0c9600056dac8dffe43ce4e93eba3"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 15:19:55.525583 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:55.525533 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" podUID="703b21b6-219e-47d9-859a-28871635be3d" 
containerName="service-proxy" containerID="cri-o://1e3ce9491a8f3280b063d4231341462d1ba0c9600056dac8dffe43ce4e93eba3" gracePeriod=30 Apr 17 15:19:55.578721 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:55.578664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dxplq" event={"ID":"40471bd1-f59d-42eb-84db-1b5f51c4f3e8","Type":"ContainerStarted","Data":"856e367c5849fa94a9c27f68d0c60eaf1ccac706f2c5c3399cf04318fd9127e6"} Apr 17 15:19:56.583337 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:56.583299 2576 generic.go:358] "Generic (PLEG): container finished" podID="703b21b6-219e-47d9-859a-28871635be3d" containerID="1e3ce9491a8f3280b063d4231341462d1ba0c9600056dac8dffe43ce4e93eba3" exitCode=2 Apr 17 15:19:56.583337 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:56.583327 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" event={"ID":"703b21b6-219e-47d9-859a-28871635be3d","Type":"ContainerDied","Data":"1e3ce9491a8f3280b063d4231341462d1ba0c9600056dac8dffe43ce4e93eba3"} Apr 17 15:19:56.583847 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:56.583365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66ff9f64b6-qbbfz" event={"ID":"703b21b6-219e-47d9-859a-28871635be3d","Type":"ContainerStarted","Data":"bcf540cb730d931736645e0a04e8b477bd7b7e7aa8d46d7b39ab27f5d6e650b3"} Apr 17 15:19:56.584787 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:56.584767 2576 generic.go:358] "Generic (PLEG): container finished" podID="40471bd1-f59d-42eb-84db-1b5f51c4f3e8" containerID="1a5fb78e3ae38fb5cc18522c269dfa76b8902b1799475aa1b8f69e3f88112b86" exitCode=0 Apr 17 15:19:56.584894 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:56.584793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dxplq" 
event={"ID":"40471bd1-f59d-42eb-84db-1b5f51c4f3e8","Type":"ContainerDied","Data":"1a5fb78e3ae38fb5cc18522c269dfa76b8902b1799475aa1b8f69e3f88112b86"} Apr 17 15:19:57.589409 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:57.589366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dxplq" event={"ID":"40471bd1-f59d-42eb-84db-1b5f51c4f3e8","Type":"ContainerStarted","Data":"fda853156016802e8636976e5c5d4fe9095de859fec5c615a74566abe6d4be55"} Apr 17 15:19:57.589409 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:57.589406 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dxplq" event={"ID":"40471bd1-f59d-42eb-84db-1b5f51c4f3e8","Type":"ContainerStarted","Data":"242b59cf17a247c37de231194469a1bd32da296bb76518d968dec18d39257c44"} Apr 17 15:19:57.609910 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:57.609862 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dxplq" podStartSLOduration=2.730724735 podStartE2EDuration="3.609848691s" podCreationTimestamp="2026-04-17 15:19:54 +0000 UTC" firstStartedPulling="2026-04-17 15:19:54.871025573 +0000 UTC m=+151.327207646" lastFinishedPulling="2026-04-17 15:19:55.750149524 +0000 UTC m=+152.206331602" observedRunningTime="2026-04-17 15:19:57.608483859 +0000 UTC m=+154.064665967" watchObservedRunningTime="2026-04-17 15:19:57.609848691 +0000 UTC m=+154.066030787" Apr 17 15:19:59.299483 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:59.299444 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4"] Apr 17 15:19:59.302296 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:59.302273 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4" Apr 17 15:19:59.304613 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:59.304588 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 15:19:59.304731 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:59.304643 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-xdgbp\"" Apr 17 15:19:59.308523 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:59.308501 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4"] Apr 17 15:19:59.345266 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:59.345232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/28a3e5c6-6810-4b5f-8dbf-cac934703031-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tw2s4\" (UID: \"28a3e5c6-6810-4b5f-8dbf-cac934703031\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4" Apr 17 15:19:59.446409 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:59.446372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/28a3e5c6-6810-4b5f-8dbf-cac934703031-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tw2s4\" (UID: \"28a3e5c6-6810-4b5f-8dbf-cac934703031\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4" Apr 17 15:19:59.446607 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:59.446535 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 15:19:59.446681 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:59.446618 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/28a3e5c6-6810-4b5f-8dbf-cac934703031-monitoring-plugin-cert podName:28a3e5c6-6810-4b5f-8dbf-cac934703031 nodeName:}" failed. No retries permitted until 2026-04-17 15:19:59.946595568 +0000 UTC m=+156.402777656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/28a3e5c6-6810-4b5f-8dbf-cac934703031-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-tw2s4" (UID: "28a3e5c6-6810-4b5f-8dbf-cac934703031") : secret "monitoring-plugin-cert" not found
Apr 17 15:19:59.950301 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:59.950241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/28a3e5c6-6810-4b5f-8dbf-cac934703031-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tw2s4\" (UID: \"28a3e5c6-6810-4b5f-8dbf-cac934703031\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4"
Apr 17 15:19:59.952734 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:19:59.952706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/28a3e5c6-6810-4b5f-8dbf-cac934703031-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tw2s4\" (UID: \"28a3e5c6-6810-4b5f-8dbf-cac934703031\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4"
Apr 17 15:19:59.981904 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:59.981857 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-dw8kz" podUID="37750dd6-91d9-43c3-a3ac-75a2c4ce6eec"
Apr 17 15:19:59.994026 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:19:59.993987 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4g86w" podUID="b80c4e77-d795-4111-a247-f612ad85f926"
Apr 17 15:20:00.150206 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:20:00.150163 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-82cq8" podUID="41bb8b03-e874-455f-8416-b76d91f0f117"
Apr 17 15:20:00.212816 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:00.212783 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4"
Apr 17 15:20:00.342267 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:00.342233 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4"]
Apr 17 15:20:00.345561 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:20:00.345531 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28a3e5c6_6810_4b5f_8dbf_cac934703031.slice/crio-742ff8ceb00663a4b76b1ab7c97c390f2f439f641006a5c23ee9285837ff4c79 WatchSource:0}: Error finding container 742ff8ceb00663a4b76b1ab7c97c390f2f439f641006a5c23ee9285837ff4c79: Status 404 returned error can't find the container with id 742ff8ceb00663a4b76b1ab7c97c390f2f439f641006a5c23ee9285837ff4c79
Apr 17 15:20:00.598754 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:00.598672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4" event={"ID":"28a3e5c6-6810-4b5f-8dbf-cac934703031","Type":"ContainerStarted","Data":"742ff8ceb00663a4b76b1ab7c97c390f2f439f641006a5c23ee9285837ff4c79"}
Apr 17 15:20:00.598754 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:00.598705 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dw8kz"
Apr 17 15:20:02.606262 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:02.606225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4" event={"ID":"28a3e5c6-6810-4b5f-8dbf-cac934703031","Type":"ContainerStarted","Data":"1bef424649401f460cc257e6ff36abeddc340039507cb3de5f22489648595e0f"}
Apr 17 15:20:02.606699 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:02.606416 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4"
Apr 17 15:20:02.610884 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:02.610863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4"
Apr 17 15:20:02.619794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:02.619749 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tw2s4" podStartSLOduration=2.358697409 podStartE2EDuration="3.619733347s" podCreationTimestamp="2026-04-17 15:19:59 +0000 UTC" firstStartedPulling="2026-04-17 15:20:00.347419484 +0000 UTC m=+156.803601562" lastFinishedPulling="2026-04-17 15:20:01.608455417 +0000 UTC m=+158.064637500" observedRunningTime="2026-04-17 15:20:02.619322199 +0000 UTC m=+159.075504294" watchObservedRunningTime="2026-04-17 15:20:02.619733347 +0000 UTC m=+159.075915444"
Apr 17 15:20:04.885278 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:04.885238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz"
Apr 17 15:20:04.887646 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:04.887622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37750dd6-91d9-43c3-a3ac-75a2c4ce6eec-metrics-tls\") pod \"dns-default-dw8kz\" (UID: \"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec\") " pod="openshift-dns/dns-default-dw8kz"
Apr 17 15:20:04.986337 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:04.985762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w"
Apr 17 15:20:04.989114 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:04.989081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b80c4e77-d795-4111-a247-f612ad85f926-cert\") pod \"ingress-canary-4g86w\" (UID: \"b80c4e77-d795-4111-a247-f612ad85f926\") " pod="openshift-ingress-canary/ingress-canary-4g86w"
Apr 17 15:20:05.101559 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:05.101519 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p8skx\""
Apr 17 15:20:05.110437 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:05.110409 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dw8kz"
Apr 17 15:20:05.226515 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:05.226486 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dw8kz"]
Apr 17 15:20:05.230360 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:20:05.230330 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37750dd6_91d9_43c3_a3ac_75a2c4ce6eec.slice/crio-515f3680e63befe17d331a46e82beac78ec089bf3c287dcb29119c4126483263 WatchSource:0}: Error finding container 515f3680e63befe17d331a46e82beac78ec089bf3c287dcb29119c4126483263: Status 404 returned error can't find the container with id 515f3680e63befe17d331a46e82beac78ec089bf3c287dcb29119c4126483263
Apr 17 15:20:05.615064 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:05.615016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dw8kz" event={"ID":"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec","Type":"ContainerStarted","Data":"515f3680e63befe17d331a46e82beac78ec089bf3c287dcb29119c4126483263"}
Apr 17 15:20:07.622268 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:07.622229 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dw8kz" event={"ID":"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec","Type":"ContainerStarted","Data":"a0b04be91371b203c001e7bd9931020a12121dc8e31db64d202ef50840285338"}
Apr 17 15:20:07.622268 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:07.622265 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dw8kz" event={"ID":"37750dd6-91d9-43c3-a3ac-75a2c4ce6eec","Type":"ContainerStarted","Data":"1a4e4902048c74b94294e280738989536e2f5160422d884a7f88d2f4a41c984b"}
Apr 17 15:20:07.622703 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:07.622356 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dw8kz"
Apr 17 15:20:07.638781 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:07.638733 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dw8kz" podStartSLOduration=130.338346754 podStartE2EDuration="2m11.638719354s" podCreationTimestamp="2026-04-17 15:17:56 +0000 UTC" firstStartedPulling="2026-04-17 15:20:05.232513952 +0000 UTC m=+161.688696025" lastFinishedPulling="2026-04-17 15:20:06.532886551 +0000 UTC m=+162.989068625" observedRunningTime="2026-04-17 15:20:07.637225358 +0000 UTC m=+164.093407453" watchObservedRunningTime="2026-04-17 15:20:07.638719354 +0000 UTC m=+164.094901449"
Apr 17 15:20:13.759797 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:13.759763 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-xt8cx"]
Apr 17 15:20:13.765072 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:13.765033 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xt8cx"
Apr 17 15:20:13.767313 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:13.767289 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 15:20:13.767659 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:13.767645 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 15:20:13.767697 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:13.767652 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-phbqw\""
Apr 17 15:20:13.770218 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:13.770189 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xt8cx"]
Apr 17 15:20:13.847370 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:13.847334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psqcx\" (UniqueName: \"kubernetes.io/projected/90902de8-0523-47d2-bac0-8249a7985ae8-kube-api-access-psqcx\") pod \"downloads-6bcc868b7-xt8cx\" (UID: \"90902de8-0523-47d2-bac0-8249a7985ae8\") " pod="openshift-console/downloads-6bcc868b7-xt8cx"
Apr 17 15:20:13.948605 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:13.948558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psqcx\" (UniqueName: \"kubernetes.io/projected/90902de8-0523-47d2-bac0-8249a7985ae8-kube-api-access-psqcx\") pod \"downloads-6bcc868b7-xt8cx\" (UID: \"90902de8-0523-47d2-bac0-8249a7985ae8\") " pod="openshift-console/downloads-6bcc868b7-xt8cx"
Apr 17 15:20:13.955855 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:13.955822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psqcx\" (UniqueName: \"kubernetes.io/projected/90902de8-0523-47d2-bac0-8249a7985ae8-kube-api-access-psqcx\") pod \"downloads-6bcc868b7-xt8cx\" (UID: \"90902de8-0523-47d2-bac0-8249a7985ae8\") " pod="openshift-console/downloads-6bcc868b7-xt8cx"
Apr 17 15:20:14.075039 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:14.074949 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xt8cx"
Apr 17 15:20:14.142238 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:14.141898 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:20:14.142238 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:14.142103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4g86w"
Apr 17 15:20:14.144799 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:14.144700 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wwl79\""
Apr 17 15:20:14.152840 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:14.152667 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4g86w"
Apr 17 15:20:14.197833 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:14.197796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xt8cx"]
Apr 17 15:20:14.203315 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:20:14.203263 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90902de8_0523_47d2_bac0_8249a7985ae8.slice/crio-11219c33243757d3c881def4a77f8929d20fd8fd6b46387eec5eb7f2dbf08c2d WatchSource:0}: Error finding container 11219c33243757d3c881def4a77f8929d20fd8fd6b46387eec5eb7f2dbf08c2d: Status 404 returned error can't find the container with id 11219c33243757d3c881def4a77f8929d20fd8fd6b46387eec5eb7f2dbf08c2d
Apr 17 15:20:14.271432 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:14.271401 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4g86w"]
Apr 17 15:20:14.275442 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:20:14.275419 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb80c4e77_d795_4111_a247_f612ad85f926.slice/crio-5b5df879bfd646910b408084d4243a8442a716c9bf46306ffeced223146ad7c7 WatchSource:0}: Error finding container 5b5df879bfd646910b408084d4243a8442a716c9bf46306ffeced223146ad7c7: Status 404 returned error can't find the container with id 5b5df879bfd646910b408084d4243a8442a716c9bf46306ffeced223146ad7c7
Apr 17 15:20:14.638367 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:14.638328 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4g86w" event={"ID":"b80c4e77-d795-4111-a247-f612ad85f926","Type":"ContainerStarted","Data":"5b5df879bfd646910b408084d4243a8442a716c9bf46306ffeced223146ad7c7"}
Apr 17 15:20:14.639281 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:14.639256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xt8cx" event={"ID":"90902de8-0523-47d2-bac0-8249a7985ae8","Type":"ContainerStarted","Data":"11219c33243757d3c881def4a77f8929d20fd8fd6b46387eec5eb7f2dbf08c2d"}
Apr 17 15:20:16.650113 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:16.650074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4g86w" event={"ID":"b80c4e77-d795-4111-a247-f612ad85f926","Type":"ContainerStarted","Data":"60d3f1019a2504c168a4f98bb56f21fd1591ba0aa132831a2c9e4258b79203c6"}
Apr 17 15:20:16.666368 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:16.666319 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4g86w" podStartSLOduration=138.795965899 podStartE2EDuration="2m20.666304s" podCreationTimestamp="2026-04-17 15:17:56 +0000 UTC" firstStartedPulling="2026-04-17 15:20:14.277196838 +0000 UTC m=+170.733378911" lastFinishedPulling="2026-04-17 15:20:16.147534923 +0000 UTC m=+172.603717012" observedRunningTime="2026-04-17 15:20:16.666150896 +0000 UTC m=+173.122332988" watchObservedRunningTime="2026-04-17 15:20:16.666304 +0000 UTC m=+173.122486095"
Apr 17 15:20:17.626831 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:17.626794 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dw8kz"
Apr 17 15:20:30.694465 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:30.694424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xt8cx" event={"ID":"90902de8-0523-47d2-bac0-8249a7985ae8","Type":"ContainerStarted","Data":"6bd49854dd7dda777dc8f571067206af6c5c8106fbf1bba2380469d85a428ce7"}
Apr 17 15:20:30.694931 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:30.694621 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-xt8cx"
Apr 17 15:20:30.709611 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:30.709581 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-xt8cx"
Apr 17 15:20:30.712739 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:30.712690 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-xt8cx" podStartSLOduration=1.9276098780000002 podStartE2EDuration="17.712674083s" podCreationTimestamp="2026-04-17 15:20:13 +0000 UTC" firstStartedPulling="2026-04-17 15:20:14.205302597 +0000 UTC m=+170.661484670" lastFinishedPulling="2026-04-17 15:20:29.990366802 +0000 UTC m=+186.446548875" observedRunningTime="2026-04-17 15:20:30.710606437 +0000 UTC m=+187.166788558" watchObservedRunningTime="2026-04-17 15:20:30.712674083 +0000 UTC m=+187.168856179"
Apr 17 15:20:35.604430 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.604395 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-tw2s4_28a3e5c6-6810-4b5f-8dbf-cac934703031/monitoring-plugin/0.log"
Apr 17 15:20:35.660293 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.660254 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5b7cb5c8fc-lnw7t"]
Apr 17 15:20:35.677198 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.677123 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b7cb5c8fc-lnw7t"]
Apr 17 15:20:35.677375 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.677257 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.680581 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.680555 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 15:20:35.680770 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.680555 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 15:20:35.680947 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.680584 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-w8shx\""
Apr 17 15:20:35.681120 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.680693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 15:20:35.681604 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.681586 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 15:20:35.681927 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.681904 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 15:20:35.686384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.686215 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 15:20:35.846180 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.846141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-oauth-config\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.846369 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.846195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-service-ca\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.846369 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.846224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-console-config\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.846369 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.846301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-serving-cert\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.846369 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.846330 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-trusted-ca-bundle\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.846369 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.846354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-oauth-serving-cert\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.846369 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.846370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x68s\" (UniqueName: \"kubernetes.io/projected/841e27df-ba77-45b4-a8dd-d236e106ccac-kube-api-access-2x68s\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.947355 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.947255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-oauth-config\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.947355 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.947323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-service-ca\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.947594 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.947358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-console-config\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.947594 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.947399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-serving-cert\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.947594 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.947423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-trusted-ca-bundle\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.947594 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.947444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-oauth-serving-cert\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.947594 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.947471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x68s\" (UniqueName: \"kubernetes.io/projected/841e27df-ba77-45b4-a8dd-d236e106ccac-kube-api-access-2x68s\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.948160 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.948134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-service-ca\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.948277 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.948198 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-console-config\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.948277 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.948239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-oauth-serving-cert\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.948404 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.948319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-trusted-ca-bundle\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.956147 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.956114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x68s\" (UniqueName: \"kubernetes.io/projected/841e27df-ba77-45b4-a8dd-d236e106ccac-kube-api-access-2x68s\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.957038 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.957018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-serving-cert\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.957038 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.957028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-oauth-config\") pod \"console-5b7cb5c8fc-lnw7t\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:35.988794 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:35.988756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:36.124477 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:36.124445 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5b7cb5c8fc-lnw7t"]
Apr 17 15:20:36.129412 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:20:36.129375 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod841e27df_ba77_45b4_a8dd_d236e106ccac.slice/crio-dbfd4988c616aed73f95a53afce1e38de4e416a545344573d0142d051d7a0ddf WatchSource:0}: Error finding container dbfd4988c616aed73f95a53afce1e38de4e416a545344573d0142d051d7a0ddf: Status 404 returned error can't find the container with id dbfd4988c616aed73f95a53afce1e38de4e416a545344573d0142d051d7a0ddf
Apr 17 15:20:36.406335 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:36.406301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dxplq_40471bd1-f59d-42eb-84db-1b5f51c4f3e8/init-textfile/0.log"
Apr 17 15:20:36.605127 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:36.605091 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dxplq_40471bd1-f59d-42eb-84db-1b5f51c4f3e8/node-exporter/0.log"
Apr 17 15:20:36.711819 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:36.711767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b7cb5c8fc-lnw7t" event={"ID":"841e27df-ba77-45b4-a8dd-d236e106ccac","Type":"ContainerStarted","Data":"dbfd4988c616aed73f95a53afce1e38de4e416a545344573d0142d051d7a0ddf"}
Apr 17 15:20:36.804653 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:36.804623 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dxplq_40471bd1-f59d-42eb-84db-1b5f51c4f3e8/kube-rbac-proxy/0.log"
Apr 17 15:20:39.606581 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:39.606543 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-w4mf6_3a5e95c4-7657-40ac-9a51-cd8077d947ec/prometheus-operator/0.log"
Apr 17 15:20:39.804204 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:39.804178 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-w4mf6_3a5e95c4-7657-40ac-9a51-cd8077d947ec/kube-rbac-proxy/0.log"
Apr 17 15:20:40.724881 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:40.724845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b7cb5c8fc-lnw7t" event={"ID":"841e27df-ba77-45b4-a8dd-d236e106ccac","Type":"ContainerStarted","Data":"5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6"}
Apr 17 15:20:40.742912 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:40.742860 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5b7cb5c8fc-lnw7t" podStartSLOduration=2.189450913 podStartE2EDuration="5.742845579s" podCreationTimestamp="2026-04-17 15:20:35 +0000 UTC" firstStartedPulling="2026-04-17 15:20:36.131741587 +0000 UTC m=+192.587923665" lastFinishedPulling="2026-04-17 15:20:39.685136255 +0000 UTC m=+196.141318331" observedRunningTime="2026-04-17 15:20:40.741757499 +0000 UTC m=+197.197939598" watchObservedRunningTime="2026-04-17 15:20:40.742845579 +0000 UTC m=+197.199027674"
Apr 17 15:20:42.807894 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:42.807861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-xt8cx_90902de8-0523-47d2-bac0-8249a7985ae8/download-server/0.log"
Apr 17 15:20:45.990030 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:45.989318 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:45.990030 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:45.989378 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:45.996977 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:45.996952 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:20:46.747822 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:20:46.747795 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:21:24.932965 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:24.932931 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b7cb5c8fc-lnw7t"]
Apr 17 15:21:36.019614 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:36.019571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:21:36.021957 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:36.021934 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41bb8b03-e874-455f-8416-b76d91f0f117-metrics-certs\") pod \"network-metrics-daemon-82cq8\" (UID: \"41bb8b03-e874-455f-8416-b76d91f0f117\") " pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:21:36.045512 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:36.045484 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9m884\""
Apr 17 15:21:36.053546 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:36.053526 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82cq8"
Apr 17 15:21:36.169176 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:36.169146 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-82cq8"]
Apr 17 15:21:36.172117 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:21:36.172078 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41bb8b03_e874_455f_8416_b76d91f0f117.slice/crio-fd8bcf404a0308910312e973226bc43b0eaf0a0e0cbfb157b81a4c811f587abc WatchSource:0}: Error finding container fd8bcf404a0308910312e973226bc43b0eaf0a0e0cbfb157b81a4c811f587abc: Status 404 returned error can't find the container with id fd8bcf404a0308910312e973226bc43b0eaf0a0e0cbfb157b81a4c811f587abc
Apr 17 15:21:36.893792 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:36.893754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-82cq8" event={"ID":"41bb8b03-e874-455f-8416-b76d91f0f117","Type":"ContainerStarted","Data":"fd8bcf404a0308910312e973226bc43b0eaf0a0e0cbfb157b81a4c811f587abc"}
Apr 17 15:21:37.898358 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:37.898318 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-82cq8" event={"ID":"41bb8b03-e874-455f-8416-b76d91f0f117","Type":"ContainerStarted","Data":"47e2a602575503f6030a3c54429e59a91fda7cae094cbd021b0dfe6978a3281f"}
Apr 17 15:21:37.898358 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:37.898355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-82cq8" event={"ID":"41bb8b03-e874-455f-8416-b76d91f0f117","Type":"ContainerStarted","Data":"c1d006b3ac9cd6a117006dd878b9b6e2fc07f632ce1df3d96eae06280cebdef7"}
Apr 17 15:21:49.951222 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:49.951182 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5b7cb5c8fc-lnw7t" podUID="841e27df-ba77-45b4-a8dd-d236e106ccac" containerName="console" containerID="cri-o://5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6" gracePeriod=15
Apr 17 15:21:50.185302 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.185279 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b7cb5c8fc-lnw7t_841e27df-ba77-45b4-a8dd-d236e106ccac/console/0.log"
Apr 17 15:21:50.185414 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.185350 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b7cb5c8fc-lnw7t"
Apr 17 15:21:50.206707 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.206662 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-82cq8" podStartSLOduration=265.071561168 podStartE2EDuration="4m26.206646854s" podCreationTimestamp="2026-04-17 15:17:24 +0000 UTC" firstStartedPulling="2026-04-17 15:21:36.173849912 +0000 UTC m=+252.630032002" lastFinishedPulling="2026-04-17 15:21:37.308935616 +0000 UTC m=+253.765117688" observedRunningTime="2026-04-17 15:21:37.914749848 +0000 UTC m=+254.370931946" watchObservedRunningTime="2026-04-17 15:21:50.206646854 +0000 UTC m=+266.662828949"
Apr 17 15:21:50.325188 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325150 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x68s\" (UniqueName: \"kubernetes.io/projected/841e27df-ba77-45b4-a8dd-d236e106ccac-kube-api-access-2x68s\") pod \"841e27df-ba77-45b4-a8dd-d236e106ccac\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") "
Apr 17 15:21:50.325188 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325187 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-serving-cert\") pod \"841e27df-ba77-45b4-a8dd-d236e106ccac\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") "
Apr 17 15:21:50.325400 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325211 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-console-config\") pod \"841e27df-ba77-45b4-a8dd-d236e106ccac\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") "
Apr 17 15:21:50.325400 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325239 2576 reconciler_common.go:162]
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-trusted-ca-bundle\") pod \"841e27df-ba77-45b4-a8dd-d236e106ccac\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " Apr 17 15:21:50.325400 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325367 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-oauth-serving-cert\") pod \"841e27df-ba77-45b4-a8dd-d236e106ccac\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " Apr 17 15:21:50.325545 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325440 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-service-ca\") pod \"841e27df-ba77-45b4-a8dd-d236e106ccac\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " Apr 17 15:21:50.325545 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325484 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-oauth-config\") pod \"841e27df-ba77-45b4-a8dd-d236e106ccac\" (UID: \"841e27df-ba77-45b4-a8dd-d236e106ccac\") " Apr 17 15:21:50.325745 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325699 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-console-config" (OuterVolumeSpecName: "console-config") pod "841e27df-ba77-45b4-a8dd-d236e106ccac" (UID: "841e27df-ba77-45b4-a8dd-d236e106ccac"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:50.325745 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325718 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "841e27df-ba77-45b4-a8dd-d236e106ccac" (UID: "841e27df-ba77-45b4-a8dd-d236e106ccac"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:50.325861 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325709 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "841e27df-ba77-45b4-a8dd-d236e106ccac" (UID: "841e27df-ba77-45b4-a8dd-d236e106ccac"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:50.325861 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.325820 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-service-ca" (OuterVolumeSpecName: "service-ca") pod "841e27df-ba77-45b4-a8dd-d236e106ccac" (UID: "841e27df-ba77-45b4-a8dd-d236e106ccac"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 15:21:50.327532 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.327503 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "841e27df-ba77-45b4-a8dd-d236e106ccac" (UID: "841e27df-ba77-45b4-a8dd-d236e106ccac"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:50.327648 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.327552 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "841e27df-ba77-45b4-a8dd-d236e106ccac" (UID: "841e27df-ba77-45b4-a8dd-d236e106ccac"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 15:21:50.327648 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.327605 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/841e27df-ba77-45b4-a8dd-d236e106ccac-kube-api-access-2x68s" (OuterVolumeSpecName: "kube-api-access-2x68s") pod "841e27df-ba77-45b4-a8dd-d236e106ccac" (UID: "841e27df-ba77-45b4-a8dd-d236e106ccac"). InnerVolumeSpecName "kube-api-access-2x68s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:21:50.426218 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.426187 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2x68s\" (UniqueName: \"kubernetes.io/projected/841e27df-ba77-45b4-a8dd-d236e106ccac-kube-api-access-2x68s\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:21:50.426218 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.426216 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-serving-cert\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:21:50.426218 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.426226 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-console-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:21:50.426427 ip-10-0-133-75 
kubenswrapper[2576]: I0417 15:21:50.426235 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-trusted-ca-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:21:50.426427 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.426244 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-oauth-serving-cert\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:21:50.426427 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.426253 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/841e27df-ba77-45b4-a8dd-d236e106ccac-service-ca\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:21:50.426427 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.426261 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/841e27df-ba77-45b4-a8dd-d236e106ccac-console-oauth-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:21:50.935933 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.935901 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5b7cb5c8fc-lnw7t_841e27df-ba77-45b4-a8dd-d236e106ccac/console/0.log" Apr 17 15:21:50.936125 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.935951 2576 generic.go:358] "Generic (PLEG): container finished" podID="841e27df-ba77-45b4-a8dd-d236e106ccac" containerID="5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6" exitCode=2 Apr 17 15:21:50.936125 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.935989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b7cb5c8fc-lnw7t" 
event={"ID":"841e27df-ba77-45b4-a8dd-d236e106ccac","Type":"ContainerDied","Data":"5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6"} Apr 17 15:21:50.936125 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.936012 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5b7cb5c8fc-lnw7t" Apr 17 15:21:50.936125 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.936018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5b7cb5c8fc-lnw7t" event={"ID":"841e27df-ba77-45b4-a8dd-d236e106ccac","Type":"ContainerDied","Data":"dbfd4988c616aed73f95a53afce1e38de4e416a545344573d0142d051d7a0ddf"} Apr 17 15:21:50.936125 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.936037 2576 scope.go:117] "RemoveContainer" containerID="5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6" Apr 17 15:21:50.944584 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.944564 2576 scope.go:117] "RemoveContainer" containerID="5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6" Apr 17 15:21:50.944877 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:21:50.944853 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6\": container with ID starting with 5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6 not found: ID does not exist" containerID="5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6" Apr 17 15:21:50.944937 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.944889 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6"} err="failed to get container status \"5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6\": rpc error: code = NotFound desc = could not find container 
\"5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6\": container with ID starting with 5aba0c4928e85491bb5eba5ddcb3b007b54d76660b260be138143ada9742b7b6 not found: ID does not exist" Apr 17 15:21:50.956361 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.956332 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5b7cb5c8fc-lnw7t"] Apr 17 15:21:50.965104 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:50.965081 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5b7cb5c8fc-lnw7t"] Apr 17 15:21:52.138523 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.138480 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="841e27df-ba77-45b4-a8dd-d236e106ccac" path="/var/lib/kubelet/pods/841e27df-ba77-45b4-a8dd-d236e106ccac/volumes" Apr 17 15:21:52.281738 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.281705 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7gp9s"] Apr 17 15:21:52.281937 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.281926 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="841e27df-ba77-45b4-a8dd-d236e106ccac" containerName="console" Apr 17 15:21:52.281974 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.281939 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="841e27df-ba77-45b4-a8dd-d236e106ccac" containerName="console" Apr 17 15:21:52.282007 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.281991 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="841e27df-ba77-45b4-a8dd-d236e106ccac" containerName="console" Apr 17 15:21:52.287328 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.287310 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.289578 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.289559 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 15:21:52.291748 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.291726 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7gp9s"] Apr 17 15:21:52.438990 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.438885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd2172f8-51a1-4324-8352-f33756fe0535-kubelet-config\") pod \"global-pull-secret-syncer-7gp9s\" (UID: \"fd2172f8-51a1-4324-8352-f33756fe0535\") " pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.439183 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.439002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd2172f8-51a1-4324-8352-f33756fe0535-original-pull-secret\") pod \"global-pull-secret-syncer-7gp9s\" (UID: \"fd2172f8-51a1-4324-8352-f33756fe0535\") " pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.439183 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.439036 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd2172f8-51a1-4324-8352-f33756fe0535-dbus\") pod \"global-pull-secret-syncer-7gp9s\" (UID: \"fd2172f8-51a1-4324-8352-f33756fe0535\") " pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.540062 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.540020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/fd2172f8-51a1-4324-8352-f33756fe0535-original-pull-secret\") pod \"global-pull-secret-syncer-7gp9s\" (UID: \"fd2172f8-51a1-4324-8352-f33756fe0535\") " pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.540263 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.540152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd2172f8-51a1-4324-8352-f33756fe0535-dbus\") pod \"global-pull-secret-syncer-7gp9s\" (UID: \"fd2172f8-51a1-4324-8352-f33756fe0535\") " pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.540263 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.540177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd2172f8-51a1-4324-8352-f33756fe0535-kubelet-config\") pod \"global-pull-secret-syncer-7gp9s\" (UID: \"fd2172f8-51a1-4324-8352-f33756fe0535\") " pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.540263 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.540248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd2172f8-51a1-4324-8352-f33756fe0535-kubelet-config\") pod \"global-pull-secret-syncer-7gp9s\" (UID: \"fd2172f8-51a1-4324-8352-f33756fe0535\") " pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.540420 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.540326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd2172f8-51a1-4324-8352-f33756fe0535-dbus\") pod \"global-pull-secret-syncer-7gp9s\" (UID: \"fd2172f8-51a1-4324-8352-f33756fe0535\") " pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.542381 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.542361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd2172f8-51a1-4324-8352-f33756fe0535-original-pull-secret\") pod \"global-pull-secret-syncer-7gp9s\" (UID: \"fd2172f8-51a1-4324-8352-f33756fe0535\") " pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.596556 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.596515 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7gp9s" Apr 17 15:21:52.711876 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.711844 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7gp9s"] Apr 17 15:21:52.714570 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:21:52.714533 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd2172f8_51a1_4324_8352_f33756fe0535.slice/crio-eabf1ddff2207c2a8d4c69f97d3dd9adcb285ba1fc370810b2fb39b503d01b3e WatchSource:0}: Error finding container eabf1ddff2207c2a8d4c69f97d3dd9adcb285ba1fc370810b2fb39b503d01b3e: Status 404 returned error can't find the container with id eabf1ddff2207c2a8d4c69f97d3dd9adcb285ba1fc370810b2fb39b503d01b3e Apr 17 15:21:52.942743 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:52.942704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7gp9s" event={"ID":"fd2172f8-51a1-4324-8352-f33756fe0535","Type":"ContainerStarted","Data":"eabf1ddff2207c2a8d4c69f97d3dd9adcb285ba1fc370810b2fb39b503d01b3e"} Apr 17 15:21:57.958761 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:57.958724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7gp9s" event={"ID":"fd2172f8-51a1-4324-8352-f33756fe0535","Type":"ContainerStarted","Data":"d9c88e88457e062697235b77ed48163840a902d73965d0f8a682513a88b3f583"} Apr 17 15:21:57.971812 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:21:57.971764 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7gp9s" podStartSLOduration=1.157033763 podStartE2EDuration="5.971747796s" podCreationTimestamp="2026-04-17 15:21:52 +0000 UTC" firstStartedPulling="2026-04-17 15:21:52.716243048 +0000 UTC m=+269.172425125" lastFinishedPulling="2026-04-17 15:21:57.530957081 +0000 UTC m=+273.987139158" observedRunningTime="2026-04-17 15:21:57.971314838 +0000 UTC m=+274.427496932" watchObservedRunningTime="2026-04-17 15:21:57.971747796 +0000 UTC m=+274.427929941" Apr 17 15:22:24.006487 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:24.006454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:22:24.007334 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:24.007314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:22:24.012283 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:24.012252 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 15:22:57.573150 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.573119 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-rgfvc"] Apr 17 15:22:57.576312 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.576287 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" Apr 17 15:22:57.578530 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.578504 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 15:22:57.578650 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.578594 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 15:22:57.579494 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.579479 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-2sshd\"" Apr 17 15:22:57.582370 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.582346 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-rgfvc"] Apr 17 15:22:57.598312 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.598288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcmg2\" (UniqueName: \"kubernetes.io/projected/9b7e2636-c164-462c-8307-9fbab5d49855-kube-api-access-rcmg2\") pod \"cert-manager-cainjector-8966b78d4-rgfvc\" (UID: \"9b7e2636-c164-462c-8307-9fbab5d49855\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" Apr 17 15:22:57.598426 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.598326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b7e2636-c164-462c-8307-9fbab5d49855-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-rgfvc\" (UID: \"9b7e2636-c164-462c-8307-9fbab5d49855\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" Apr 17 15:22:57.699415 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.699386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rcmg2\" (UniqueName: \"kubernetes.io/projected/9b7e2636-c164-462c-8307-9fbab5d49855-kube-api-access-rcmg2\") pod \"cert-manager-cainjector-8966b78d4-rgfvc\" (UID: \"9b7e2636-c164-462c-8307-9fbab5d49855\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" Apr 17 15:22:57.699563 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.699429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b7e2636-c164-462c-8307-9fbab5d49855-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-rgfvc\" (UID: \"9b7e2636-c164-462c-8307-9fbab5d49855\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" Apr 17 15:22:57.706835 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.706806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b7e2636-c164-462c-8307-9fbab5d49855-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-rgfvc\" (UID: \"9b7e2636-c164-462c-8307-9fbab5d49855\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" Apr 17 15:22:57.706937 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.706865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcmg2\" (UniqueName: \"kubernetes.io/projected/9b7e2636-c164-462c-8307-9fbab5d49855-kube-api-access-rcmg2\") pod \"cert-manager-cainjector-8966b78d4-rgfvc\" (UID: \"9b7e2636-c164-462c-8307-9fbab5d49855\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" Apr 17 15:22:57.885214 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.885128 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" Apr 17 15:22:57.997839 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:57.997806 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-rgfvc"] Apr 17 15:22:58.003680 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:22:58.001885 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b7e2636_c164_462c_8307_9fbab5d49855.slice/crio-f0b02db324c1a8076f24abdf87848565ff091799f0b62d4a078e393263c3831d WatchSource:0}: Error finding container f0b02db324c1a8076f24abdf87848565ff091799f0b62d4a078e393263c3831d: Status 404 returned error can't find the container with id f0b02db324c1a8076f24abdf87848565ff091799f0b62d4a078e393263c3831d Apr 17 15:22:58.005174 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:58.005158 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 15:22:58.111688 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:22:58.111655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" event={"ID":"9b7e2636-c164-462c-8307-9fbab5d49855","Type":"ContainerStarted","Data":"f0b02db324c1a8076f24abdf87848565ff091799f0b62d4a078e393263c3831d"} Apr 17 15:23:02.125627 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:02.125590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" event={"ID":"9b7e2636-c164-462c-8307-9fbab5d49855","Type":"ContainerStarted","Data":"469bc5eb80472ef0f45286310446ed90794e881095b973764cf7829090d5dac7"} Apr 17 15:23:02.140911 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:02.140859 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-rgfvc" podStartSLOduration=1.196030164 podStartE2EDuration="5.140844447s" 
podCreationTimestamp="2026-04-17 15:22:57 +0000 UTC" firstStartedPulling="2026-04-17 15:22:58.005285743 +0000 UTC m=+334.461467815" lastFinishedPulling="2026-04-17 15:23:01.950100021 +0000 UTC m=+338.406282098" observedRunningTime="2026-04-17 15:23:02.139984622 +0000 UTC m=+338.596166720" watchObservedRunningTime="2026-04-17 15:23:02.140844447 +0000 UTC m=+338.597026541" Apr 17 15:23:30.882164 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:30.882127 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"] Apr 17 15:23:30.889207 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:30.889182 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q" Apr 17 15:23:30.891623 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:30.891597 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 15:23:30.891750 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:30.891621 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-kvnp8\"" Apr 17 15:23:30.891750 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:30.891664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 15:23:30.891975 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:30.891958 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 15:23:30.892083 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:30.892069 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 15:23:30.895427 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:30.895406 
2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"]
Apr 17 15:23:31.039860 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.039824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1e209b7e-8e46-481b-8393-68e4d8fdb20e-webhook-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-zwx8q\" (UID: \"1e209b7e-8e46-481b-8393-68e4d8fdb20e\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.040031 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.039868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1e209b7e-8e46-481b-8393-68e4d8fdb20e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-zwx8q\" (UID: \"1e209b7e-8e46-481b-8393-68e4d8fdb20e\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.040031 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.039896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq4rr\" (UniqueName: \"kubernetes.io/projected/1e209b7e-8e46-481b-8393-68e4d8fdb20e-kube-api-access-kq4rr\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-zwx8q\" (UID: \"1e209b7e-8e46-481b-8393-68e4d8fdb20e\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.141258 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.141169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1e209b7e-8e46-481b-8393-68e4d8fdb20e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-zwx8q\" (UID: \"1e209b7e-8e46-481b-8393-68e4d8fdb20e\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.141258 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.141215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq4rr\" (UniqueName: \"kubernetes.io/projected/1e209b7e-8e46-481b-8393-68e4d8fdb20e-kube-api-access-kq4rr\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-zwx8q\" (UID: \"1e209b7e-8e46-481b-8393-68e4d8fdb20e\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.141258 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.141260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1e209b7e-8e46-481b-8393-68e4d8fdb20e-webhook-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-zwx8q\" (UID: \"1e209b7e-8e46-481b-8393-68e4d8fdb20e\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.143870 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.143844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1e209b7e-8e46-481b-8393-68e4d8fdb20e-webhook-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-zwx8q\" (UID: \"1e209b7e-8e46-481b-8393-68e4d8fdb20e\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.143979 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.143844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1e209b7e-8e46-481b-8393-68e4d8fdb20e-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-zwx8q\" (UID: \"1e209b7e-8e46-481b-8393-68e4d8fdb20e\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.149978 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.149958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq4rr\" (UniqueName: \"kubernetes.io/projected/1e209b7e-8e46-481b-8393-68e4d8fdb20e-kube-api-access-kq4rr\") pod \"opendatahub-operator-controller-manager-9bd7bdf77-zwx8q\" (UID: \"1e209b7e-8e46-481b-8393-68e4d8fdb20e\") " pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.199926 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.199901 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:31.323434 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:31.323403 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"]
Apr 17 15:23:31.327523 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:23:31.327491 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e209b7e_8e46_481b_8393_68e4d8fdb20e.slice/crio-57cdb22500fcc4c5b50c9b8a449c9f2a1b7731afe523b1a94b00695e9369ddc4 WatchSource:0}: Error finding container 57cdb22500fcc4c5b50c9b8a449c9f2a1b7731afe523b1a94b00695e9369ddc4: Status 404 returned error can't find the container with id 57cdb22500fcc4c5b50c9b8a449c9f2a1b7731afe523b1a94b00695e9369ddc4
Apr 17 15:23:32.204155 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:32.204118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q" event={"ID":"1e209b7e-8e46-481b-8393-68e4d8fdb20e","Type":"ContainerStarted","Data":"57cdb22500fcc4c5b50c9b8a449c9f2a1b7731afe523b1a94b00695e9369ddc4"}
Apr 17 15:23:34.282438 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.282406 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"]
Apr 17 15:23:34.285483 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.285459 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.287940 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.287910 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 15:23:34.288101 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.288013 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 15:23:34.288101 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.288037 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 15:23:34.288357 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.288322 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 15:23:34.288441 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.288370 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 15:23:34.288441 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.288408 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lf4nl\""
Apr 17 15:23:34.294543 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.294519 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"]
Apr 17 15:23:34.371030 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.370991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57c6941a-9760-4c68-ad71-47b739189881-cert\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.371030 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.371030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/57c6941a-9760-4c68-ad71-47b739189881-manager-config\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.371254 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.371082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnwv8\" (UniqueName: \"kubernetes.io/projected/57c6941a-9760-4c68-ad71-47b739189881-kube-api-access-fnwv8\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.371254 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.371128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/57c6941a-9760-4c68-ad71-47b739189881-metrics-cert\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.472101 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.472040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57c6941a-9760-4c68-ad71-47b739189881-cert\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.472101 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.472108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/57c6941a-9760-4c68-ad71-47b739189881-manager-config\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.472353 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.472133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnwv8\" (UniqueName: \"kubernetes.io/projected/57c6941a-9760-4c68-ad71-47b739189881-kube-api-access-fnwv8\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.472353 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.472163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/57c6941a-9760-4c68-ad71-47b739189881-metrics-cert\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.472999 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.472965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/57c6941a-9760-4c68-ad71-47b739189881-manager-config\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.474563 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.474539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/57c6941a-9760-4c68-ad71-47b739189881-metrics-cert\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.474671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.474613 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57c6941a-9760-4c68-ad71-47b739189881-cert\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.480999 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.480973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnwv8\" (UniqueName: \"kubernetes.io/projected/57c6941a-9760-4c68-ad71-47b739189881-kube-api-access-fnwv8\") pod \"lws-controller-manager-59bc47b496-68z8w\" (UID: \"57c6941a-9760-4c68-ad71-47b739189881\") " pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.596264 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.596168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:34.729194 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:34.729092 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"]
Apr 17 15:23:34.732588 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:23:34.732553 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57c6941a_9760_4c68_ad71_47b739189881.slice/crio-98486d8fd764af1ee46138b0ce90f41f858ca8d235bb7a36493632f240626f3b WatchSource:0}: Error finding container 98486d8fd764af1ee46138b0ce90f41f858ca8d235bb7a36493632f240626f3b: Status 404 returned error can't find the container with id 98486d8fd764af1ee46138b0ce90f41f858ca8d235bb7a36493632f240626f3b
Apr 17 15:23:35.214030 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:35.213996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q" event={"ID":"1e209b7e-8e46-481b-8393-68e4d8fdb20e","Type":"ContainerStarted","Data":"d95553ef8166ac672640cf176295f83664cf05e6514edb8b2c9a186a940b9703"}
Apr 17 15:23:35.214219 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:35.214122 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:35.215036 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:35.215016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w" event={"ID":"57c6941a-9760-4c68-ad71-47b739189881","Type":"ContainerStarted","Data":"98486d8fd764af1ee46138b0ce90f41f858ca8d235bb7a36493632f240626f3b"}
Apr 17 15:23:35.234137 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:35.234089 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q" podStartSLOduration=2.261800023 podStartE2EDuration="5.234073461s" podCreationTimestamp="2026-04-17 15:23:30 +0000 UTC" firstStartedPulling="2026-04-17 15:23:31.329253497 +0000 UTC m=+367.785435570" lastFinishedPulling="2026-04-17 15:23:34.301526921 +0000 UTC m=+370.757709008" observedRunningTime="2026-04-17 15:23:35.232118283 +0000 UTC m=+371.688300379" watchObservedRunningTime="2026-04-17 15:23:35.234073461 +0000 UTC m=+371.690255556"
Apr 17 15:23:38.224580 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:38.224542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w" event={"ID":"57c6941a-9760-4c68-ad71-47b739189881","Type":"ContainerStarted","Data":"3f39f1ab940811b085808b4b267c0e491814b071827b3a9ef176068149e9124c"}
Apr 17 15:23:38.224974 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:38.224677 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:23:38.242013 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:38.241962 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w" podStartSLOduration=1.675144431 podStartE2EDuration="4.241948117s" podCreationTimestamp="2026-04-17 15:23:34 +0000 UTC" firstStartedPulling="2026-04-17 15:23:34.734496036 +0000 UTC m=+371.190678116" lastFinishedPulling="2026-04-17 15:23:37.301299728 +0000 UTC m=+373.757481802" observedRunningTime="2026-04-17 15:23:38.23989015 +0000 UTC m=+374.696072244" watchObservedRunningTime="2026-04-17 15:23:38.241948117 +0000 UTC m=+374.698130212"
Apr 17 15:23:46.220924 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:46.220891 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-9bd7bdf77-zwx8q"
Apr 17 15:23:49.230692 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:23:49.230664 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-59bc47b496-68z8w"
Apr 17 15:24:29.672157 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.672123 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"]
Apr 17 15:24:29.682825 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.682798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.687120 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.687098 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-bkrrk\""
Apr 17 15:24:29.692507 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.692485 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 15:24:29.692507 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.692487 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 15:24:29.692691 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.692488 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 15:24:29.706496 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.706432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"]
Apr 17 15:24:29.796193 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.796156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c2f2fc86-6d29-4540-840d-05d2dba28fae-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.796364 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.796210 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.796364 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.796250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p8sq\" (UniqueName: \"kubernetes.io/projected/c2f2fc86-6d29-4540-840d-05d2dba28fae-kube-api-access-7p8sq\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.796364 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.796304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.796466 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.796357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.796466 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.796412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.796466 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.796453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.796565 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.796478 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.796565 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.796526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897263 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897438 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897438 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897517 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c2f2fc86-6d29-4540-840d-05d2dba28fae-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897517 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897615 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7p8sq\" (UniqueName: \"kubernetes.io/projected/c2f2fc86-6d29-4540-840d-05d2dba28fae-kube-api-access-7p8sq\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897615 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897714 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897769 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.897885 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.897857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.898104 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.898080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.898104 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.898097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.898256 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.898180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.898256 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.898229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c2f2fc86-6d29-4540-840d-05d2dba28fae-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.899757 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.899732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.899879 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.899863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.912647 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.912620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p8sq\" (UniqueName: \"kubernetes.io/projected/c2f2fc86-6d29-4540-840d-05d2dba28fae-kube-api-access-7p8sq\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.914650 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.914625 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c2f2fc86-6d29-4540-840d-05d2dba28fae-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f8m7jp\" (UID: \"c2f2fc86-6d29-4540-840d-05d2dba28fae\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:29.993618 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:29.993577 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:30.127822 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:30.127786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"]
Apr 17 15:24:30.130801 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:24:30.130772 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2f2fc86_6d29_4540_840d_05d2dba28fae.slice/crio-6c4e6c93ed715276a6f5ffe5ebeab09a21f5e872f4529c74532cd2d86592ef0c WatchSource:0}: Error finding container 6c4e6c93ed715276a6f5ffe5ebeab09a21f5e872f4529c74532cd2d86592ef0c: Status 404 returned error can't find the container with id 6c4e6c93ed715276a6f5ffe5ebeab09a21f5e872f4529c74532cd2d86592ef0c
Apr 17 15:24:30.364933 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:30.364843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp" event={"ID":"c2f2fc86-6d29-4540-840d-05d2dba28fae","Type":"ContainerStarted","Data":"6c4e6c93ed715276a6f5ffe5ebeab09a21f5e872f4529c74532cd2d86592ef0c"}
Apr 17 15:24:32.664833 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:32.664794 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 15:24:32.665116 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:32.664869 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 15:24:32.665116 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:32.664897 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 15:24:33.375185 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:33.375144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp" event={"ID":"c2f2fc86-6d29-4540-840d-05d2dba28fae","Type":"ContainerStarted","Data":"3d091c8892c83d3655e6d3393b98033f4a7f3fb45e4b8620f07c425d4c79e275"}
Apr 17 15:24:33.397001 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:33.396927 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp" podStartSLOduration=1.86495337 podStartE2EDuration="4.396909877s" podCreationTimestamp="2026-04-17 15:24:29 +0000 UTC" firstStartedPulling="2026-04-17 15:24:30.132599658 +0000 UTC m=+426.588781735" lastFinishedPulling="2026-04-17 15:24:32.66455617 +0000 UTC m=+429.120738242" observedRunningTime="2026-04-17 15:24:33.395093152 +0000 UTC m=+429.851275248" watchObservedRunningTime="2026-04-17 15:24:33.396909877 +0000 UTC m=+429.853091972"
Apr 17 15:24:33.994786 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:33.994749 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:33.999513 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:33.999487 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:34.383043 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:34.382945 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:34.384008 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:34.383986 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f8m7jp"
Apr 17 15:24:49.487984 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.487907 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m85vt"]
Apr 17 15:24:49.491158 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.491133 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m85vt"
Apr 17 15:24:49.493548 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.493521 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-zdv4m\""
Apr 17 15:24:49.494643 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.494620 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 17 15:24:49.494753 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.494678 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 17 15:24:49.499779 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.499754 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m85vt"]
Apr 17 15:24:49.565529 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.565491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2grc\" (UniqueName: \"kubernetes.io/projected/3fdd95c0-5ed4-4024-b0e6-40dfc8589e65-kube-api-access-l2grc\") pod \"kuadrant-operator-catalog-m85vt\" (UID: \"3fdd95c0-5ed4-4024-b0e6-40dfc8589e65\") " pod="kuadrant-system/kuadrant-operator-catalog-m85vt"
Apr 17 15:24:49.665953 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.665919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2grc\" (UniqueName: \"kubernetes.io/projected/3fdd95c0-5ed4-4024-b0e6-40dfc8589e65-kube-api-access-l2grc\") pod \"kuadrant-operator-catalog-m85vt\" (UID: \"3fdd95c0-5ed4-4024-b0e6-40dfc8589e65\") " pod="kuadrant-system/kuadrant-operator-catalog-m85vt"
Apr 17 15:24:49.673678 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.673649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2grc\" (UniqueName: \"kubernetes.io/projected/3fdd95c0-5ed4-4024-b0e6-40dfc8589e65-kube-api-access-l2grc\") pod \"kuadrant-operator-catalog-m85vt\" (UID: \"3fdd95c0-5ed4-4024-b0e6-40dfc8589e65\") " pod="kuadrant-system/kuadrant-operator-catalog-m85vt"
Apr 17 15:24:49.802012 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.801915 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m85vt"
Apr 17 15:24:49.859387 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.855779 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m85vt"]
Apr 17 15:24:49.936495 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:49.936460 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m85vt"]
Apr 17 15:24:49.939991 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:24:49.939955 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fdd95c0_5ed4_4024_b0e6_40dfc8589e65.slice/crio-a9337bcd2732e39179cb5782dbde0ef87e776fc5164bf7a710848f6de34b4990 WatchSource:0}: Error finding container a9337bcd2732e39179cb5782dbde0ef87e776fc5164bf7a710848f6de34b4990: Status 404 returned error can't find the container with id a9337bcd2732e39179cb5782dbde0ef87e776fc5164bf7a710848f6de34b4990
Apr 17 15:24:50.060788 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:50.060706 2576 kubelet.go:2537]
"SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-sqmgh"] Apr 17 15:24:50.065183 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:50.065165 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" Apr 17 15:24:50.071176 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:50.071152 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-sqmgh"] Apr 17 15:24:50.169549 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:50.169511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2fl7\" (UniqueName: \"kubernetes.io/projected/078d1eb0-c79c-4ff5-a0db-61196782bdce-kube-api-access-r2fl7\") pod \"kuadrant-operator-catalog-sqmgh\" (UID: \"078d1eb0-c79c-4ff5-a0db-61196782bdce\") " pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" Apr 17 15:24:50.270723 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:50.270690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fl7\" (UniqueName: \"kubernetes.io/projected/078d1eb0-c79c-4ff5-a0db-61196782bdce-kube-api-access-r2fl7\") pod \"kuadrant-operator-catalog-sqmgh\" (UID: \"078d1eb0-c79c-4ff5-a0db-61196782bdce\") " pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" Apr 17 15:24:50.278215 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:50.278181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fl7\" (UniqueName: \"kubernetes.io/projected/078d1eb0-c79c-4ff5-a0db-61196782bdce-kube-api-access-r2fl7\") pod \"kuadrant-operator-catalog-sqmgh\" (UID: \"078d1eb0-c79c-4ff5-a0db-61196782bdce\") " pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" Apr 17 15:24:50.375435 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:50.375353 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" Apr 17 15:24:50.429346 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:50.429227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m85vt" event={"ID":"3fdd95c0-5ed4-4024-b0e6-40dfc8589e65","Type":"ContainerStarted","Data":"a9337bcd2732e39179cb5782dbde0ef87e776fc5164bf7a710848f6de34b4990"} Apr 17 15:24:50.492197 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:50.492161 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-sqmgh"] Apr 17 15:24:50.495709 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:24:50.495680 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod078d1eb0_c79c_4ff5_a0db_61196782bdce.slice/crio-a7de3365951d50e2adf8634e1e7977d24df019d24a4fe5f99baf5ed06bdbd042 WatchSource:0}: Error finding container a7de3365951d50e2adf8634e1e7977d24df019d24a4fe5f99baf5ed06bdbd042: Status 404 returned error can't find the container with id a7de3365951d50e2adf8634e1e7977d24df019d24a4fe5f99baf5ed06bdbd042 Apr 17 15:24:51.434066 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:51.434008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" event={"ID":"078d1eb0-c79c-4ff5-a0db-61196782bdce","Type":"ContainerStarted","Data":"a7de3365951d50e2adf8634e1e7977d24df019d24a4fe5f99baf5ed06bdbd042"} Apr 17 15:24:52.438433 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:52.438340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m85vt" event={"ID":"3fdd95c0-5ed4-4024-b0e6-40dfc8589e65","Type":"ContainerStarted","Data":"8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3"} Apr 17 15:24:52.438878 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:52.438421 2576 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kuadrant-system/kuadrant-operator-catalog-m85vt" podUID="3fdd95c0-5ed4-4024-b0e6-40dfc8589e65" containerName="registry-server" containerID="cri-o://8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3" gracePeriod=2 Apr 17 15:24:52.439709 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:52.439686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" event={"ID":"078d1eb0-c79c-4ff5-a0db-61196782bdce","Type":"ContainerStarted","Data":"695fecfb126fc84df8301496a68ff84016ac7e035a7cf3dbf32ebf566c5381b2"} Apr 17 15:24:52.454484 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:52.454432 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-m85vt" podStartSLOduration=1.207021427 podStartE2EDuration="3.454414739s" podCreationTimestamp="2026-04-17 15:24:49 +0000 UTC" firstStartedPulling="2026-04-17 15:24:49.941578174 +0000 UTC m=+446.397760248" lastFinishedPulling="2026-04-17 15:24:52.188971482 +0000 UTC m=+448.645153560" observedRunningTime="2026-04-17 15:24:52.453859542 +0000 UTC m=+448.910041650" watchObservedRunningTime="2026-04-17 15:24:52.454414739 +0000 UTC m=+448.910596835" Apr 17 15:24:52.467836 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:52.467795 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" podStartSLOduration=0.708098209 podStartE2EDuration="2.467782165s" podCreationTimestamp="2026-04-17 15:24:50 +0000 UTC" firstStartedPulling="2026-04-17 15:24:50.497041317 +0000 UTC m=+446.953223390" lastFinishedPulling="2026-04-17 15:24:52.256725259 +0000 UTC m=+448.712907346" observedRunningTime="2026-04-17 15:24:52.466675468 +0000 UTC m=+448.922857562" watchObservedRunningTime="2026-04-17 15:24:52.467782165 +0000 UTC m=+448.923964260" Apr 17 15:24:52.672692 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:52.672663 2576 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m85vt" Apr 17 15:24:52.793120 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:52.793086 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2grc\" (UniqueName: \"kubernetes.io/projected/3fdd95c0-5ed4-4024-b0e6-40dfc8589e65-kube-api-access-l2grc\") pod \"3fdd95c0-5ed4-4024-b0e6-40dfc8589e65\" (UID: \"3fdd95c0-5ed4-4024-b0e6-40dfc8589e65\") " Apr 17 15:24:52.795372 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:52.795337 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fdd95c0-5ed4-4024-b0e6-40dfc8589e65-kube-api-access-l2grc" (OuterVolumeSpecName: "kube-api-access-l2grc") pod "3fdd95c0-5ed4-4024-b0e6-40dfc8589e65" (UID: "3fdd95c0-5ed4-4024-b0e6-40dfc8589e65"). InnerVolumeSpecName "kube-api-access-l2grc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:24:52.894510 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:52.894469 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2grc\" (UniqueName: \"kubernetes.io/projected/3fdd95c0-5ed4-4024-b0e6-40dfc8589e65-kube-api-access-l2grc\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:24:53.444005 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:53.443969 2576 generic.go:358] "Generic (PLEG): container finished" podID="3fdd95c0-5ed4-4024-b0e6-40dfc8589e65" containerID="8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3" exitCode=0 Apr 17 15:24:53.444457 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:53.444035 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m85vt" Apr 17 15:24:53.444457 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:53.444063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m85vt" event={"ID":"3fdd95c0-5ed4-4024-b0e6-40dfc8589e65","Type":"ContainerDied","Data":"8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3"} Apr 17 15:24:53.444457 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:53.444099 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m85vt" event={"ID":"3fdd95c0-5ed4-4024-b0e6-40dfc8589e65","Type":"ContainerDied","Data":"a9337bcd2732e39179cb5782dbde0ef87e776fc5164bf7a710848f6de34b4990"} Apr 17 15:24:53.444457 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:53.444117 2576 scope.go:117] "RemoveContainer" containerID="8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3" Apr 17 15:24:53.452420 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:53.452404 2576 scope.go:117] "RemoveContainer" containerID="8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3" Apr 17 15:24:53.452639 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:24:53.452619 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3\": container with ID starting with 8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3 not found: ID does not exist" containerID="8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3" Apr 17 15:24:53.452682 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:53.452650 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3"} err="failed to get container status \"8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3\": rpc error: 
code = NotFound desc = could not find container \"8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3\": container with ID starting with 8b4b74a46b117c4c7548d7b48a96cbed24de60453190bb572689a63ac918cfc3 not found: ID does not exist" Apr 17 15:24:53.463332 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:53.463265 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m85vt"] Apr 17 15:24:53.466214 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:53.466194 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m85vt"] Apr 17 15:24:54.139704 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:24:54.139669 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fdd95c0-5ed4-4024-b0e6-40dfc8589e65" path="/var/lib/kubelet/pods/3fdd95c0-5ed4-4024-b0e6-40dfc8589e65/volumes" Apr 17 15:25:00.376072 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:00.376009 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" Apr 17 15:25:00.376486 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:00.376089 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" Apr 17 15:25:00.397606 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:00.397576 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" Apr 17 15:25:00.486509 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:00.486480 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-sqmgh" Apr 17 15:25:17.702305 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.702266 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5"] Apr 17 15:25:17.702780 ip-10-0-133-75 
kubenswrapper[2576]: I0417 15:25:17.702548 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fdd95c0-5ed4-4024-b0e6-40dfc8589e65" containerName="registry-server" Apr 17 15:25:17.702780 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.702561 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fdd95c0-5ed4-4024-b0e6-40dfc8589e65" containerName="registry-server" Apr 17 15:25:17.702780 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.702626 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fdd95c0-5ed4-4024-b0e6-40dfc8589e65" containerName="registry-server" Apr 17 15:25:17.705377 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.705360 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" Apr 17 15:25:17.707869 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.707845 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-rgb92\"" Apr 17 15:25:17.707969 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.707848 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 15:25:17.716113 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.716094 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5"] Apr 17 15:25:17.780813 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.780779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swrzp\" (UniqueName: \"kubernetes.io/projected/f9137b3e-b1b6-4137-a433-6fce03d62db1-kube-api-access-swrzp\") pod \"dns-operator-controller-manager-648d5c98bc-hqwc5\" (UID: \"f9137b3e-b1b6-4137-a433-6fce03d62db1\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" 
Apr 17 15:25:17.882120 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.882081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swrzp\" (UniqueName: \"kubernetes.io/projected/f9137b3e-b1b6-4137-a433-6fce03d62db1-kube-api-access-swrzp\") pod \"dns-operator-controller-manager-648d5c98bc-hqwc5\" (UID: \"f9137b3e-b1b6-4137-a433-6fce03d62db1\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" Apr 17 15:25:17.889970 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:17.889938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swrzp\" (UniqueName: \"kubernetes.io/projected/f9137b3e-b1b6-4137-a433-6fce03d62db1-kube-api-access-swrzp\") pod \"dns-operator-controller-manager-648d5c98bc-hqwc5\" (UID: \"f9137b3e-b1b6-4137-a433-6fce03d62db1\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" Apr 17 15:25:18.015590 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:18.015553 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" Apr 17 15:25:18.139404 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:18.139358 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5"] Apr 17 15:25:18.142131 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:25:18.142103 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9137b3e_b1b6_4137_a433_6fce03d62db1.slice/crio-023342c66a0fba10815da498427619398de737542fcfcae539636a1c0e241dd9 WatchSource:0}: Error finding container 023342c66a0fba10815da498427619398de737542fcfcae539636a1c0e241dd9: Status 404 returned error can't find the container with id 023342c66a0fba10815da498427619398de737542fcfcae539636a1c0e241dd9 Apr 17 15:25:18.519500 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:18.519463 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" event={"ID":"f9137b3e-b1b6-4137-a433-6fce03d62db1","Type":"ContainerStarted","Data":"023342c66a0fba10815da498427619398de737542fcfcae539636a1c0e241dd9"} Apr 17 15:25:21.187250 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.187169 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc"] Apr 17 15:25:21.190416 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.190399 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" Apr 17 15:25:21.192698 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.192676 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-tqqhs\"" Apr 17 15:25:21.200612 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.200589 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc"] Apr 17 15:25:21.208179 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.208144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq6c6\" (UniqueName: \"kubernetes.io/projected/68c72ba3-4b4a-40ff-85d9-4c34cc37c052-kube-api-access-pq6c6\") pod \"limitador-operator-controller-manager-85c4996f8c-rztlc\" (UID: \"68c72ba3-4b4a-40ff-85d9-4c34cc37c052\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" Apr 17 15:25:21.309237 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.309204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pq6c6\" (UniqueName: \"kubernetes.io/projected/68c72ba3-4b4a-40ff-85d9-4c34cc37c052-kube-api-access-pq6c6\") pod \"limitador-operator-controller-manager-85c4996f8c-rztlc\" (UID: \"68c72ba3-4b4a-40ff-85d9-4c34cc37c052\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" Apr 17 15:25:21.316958 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.316933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq6c6\" (UniqueName: \"kubernetes.io/projected/68c72ba3-4b4a-40ff-85d9-4c34cc37c052-kube-api-access-pq6c6\") pod \"limitador-operator-controller-manager-85c4996f8c-rztlc\" (UID: \"68c72ba3-4b4a-40ff-85d9-4c34cc37c052\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" Apr 17 15:25:21.500235 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.500196 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" Apr 17 15:25:21.530937 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.530902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" event={"ID":"f9137b3e-b1b6-4137-a433-6fce03d62db1","Type":"ContainerStarted","Data":"021af0e7c9d3716dec9b62a435e718dd823b06997e9485062f021efd68767934"} Apr 17 15:25:21.531101 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.531064 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" Apr 17 15:25:21.550567 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.549963 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" podStartSLOduration=1.980625758 podStartE2EDuration="4.549943015s" podCreationTimestamp="2026-04-17 15:25:17 +0000 UTC" firstStartedPulling="2026-04-17 15:25:18.14383623 +0000 UTC m=+474.600018306" lastFinishedPulling="2026-04-17 15:25:20.713153485 +0000 UTC m=+477.169335563" observedRunningTime="2026-04-17 15:25:21.548179573 +0000 UTC m=+478.004361668" watchObservedRunningTime="2026-04-17 15:25:21.549943015 +0000 UTC m=+478.006125111" Apr 17 15:25:21.637061 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:21.637021 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc"] Apr 17 15:25:21.640706 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:25:21.640669 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68c72ba3_4b4a_40ff_85d9_4c34cc37c052.slice/crio-3a33e2a853c15f793a47f53597f38597dc3e5376c0b586c751d76d7b0cabcde1 WatchSource:0}: Error finding container 3a33e2a853c15f793a47f53597f38597dc3e5376c0b586c751d76d7b0cabcde1: Status 404 returned error can't find the container with id 3a33e2a853c15f793a47f53597f38597dc3e5376c0b586c751d76d7b0cabcde1 Apr 17 15:25:22.534635 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:22.534598 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" event={"ID":"68c72ba3-4b4a-40ff-85d9-4c34cc37c052","Type":"ContainerStarted","Data":"3a33e2a853c15f793a47f53597f38597dc3e5376c0b586c751d76d7b0cabcde1"} Apr 17 15:25:23.539385 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:23.539349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" event={"ID":"68c72ba3-4b4a-40ff-85d9-4c34cc37c052","Type":"ContainerStarted","Data":"84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d"} Apr 17 15:25:23.539827 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:23.539473 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" Apr 17 15:25:23.555691 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:23.555641 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" podStartSLOduration=0.7346567 podStartE2EDuration="2.555626427s" podCreationTimestamp="2026-04-17 15:25:21 +0000 UTC" firstStartedPulling="2026-04-17 15:25:21.643069079 +0000 UTC m=+478.099251155" lastFinishedPulling="2026-04-17 15:25:23.464038809 +0000 UTC m=+479.920220882" observedRunningTime="2026-04-17 15:25:23.55411385 +0000 UTC m=+480.010295945" watchObservedRunningTime="2026-04-17 
15:25:23.555626427 +0000 UTC m=+480.011808545" Apr 17 15:25:26.770775 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.770741 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6974f5cc54-56jgv"] Apr 17 15:25:26.775188 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.775158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.777660 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.777636 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 15:25:26.777790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.777717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-w8shx\"" Apr 17 15:25:26.777790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.777749 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 15:25:26.778571 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.778554 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 15:25:26.778671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.778653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 15:25:26.778727 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.778688 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 15:25:26.782827 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.782808 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 15:25:26.789215 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.789192 2576 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-console/console-6974f5cc54-56jgv"] Apr 17 15:25:26.855736 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.855702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cde72007-672b-4a83-9ebe-149f70e6e122-console-serving-cert\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.855904 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.855747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-service-ca\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.855904 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.855789 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-oauth-serving-cert\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.855904 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.855853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxhx7\" (UniqueName: \"kubernetes.io/projected/cde72007-672b-4a83-9ebe-149f70e6e122-kube-api-access-zxhx7\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.855904 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.855892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/cde72007-672b-4a83-9ebe-149f70e6e122-console-oauth-config\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.856084 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.855917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-trusted-ca-bundle\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.856084 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.855997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-console-config\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.957349 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.957309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-console-config\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.957507 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.957384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cde72007-672b-4a83-9ebe-149f70e6e122-console-serving-cert\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.957507 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.957412 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-service-ca\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.957507 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.957448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-oauth-serving-cert\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.957507 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.957483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxhx7\" (UniqueName: \"kubernetes.io/projected/cde72007-672b-4a83-9ebe-149f70e6e122-kube-api-access-zxhx7\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.957718 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.957515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cde72007-672b-4a83-9ebe-149f70e6e122-console-oauth-config\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.957718 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.957543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-trusted-ca-bundle\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 
15:25:26.958319 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.958289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-oauth-serving-cert\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.958471 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.958304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-service-ca\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.958536 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.958390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-trusted-ca-bundle\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.958726 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.958704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cde72007-672b-4a83-9ebe-149f70e6e122-console-config\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.960036 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.960008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cde72007-672b-4a83-9ebe-149f70e6e122-console-serving-cert\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " 
pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.960158 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.960108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cde72007-672b-4a83-9ebe-149f70e6e122-console-oauth-config\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:26.965487 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:26.965436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxhx7\" (UniqueName: \"kubernetes.io/projected/cde72007-672b-4a83-9ebe-149f70e6e122-kube-api-access-zxhx7\") pod \"console-6974f5cc54-56jgv\" (UID: \"cde72007-672b-4a83-9ebe-149f70e6e122\") " pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:27.085225 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:27.085142 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:27.206381 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:27.206354 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6974f5cc54-56jgv"] Apr 17 15:25:27.208687 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:25:27.208654 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcde72007_672b_4a83_9ebe_149f70e6e122.slice/crio-134ee183bc3369eb99defa8381321929cae353dec1b2d14939a42f2a96582abe WatchSource:0}: Error finding container 134ee183bc3369eb99defa8381321929cae353dec1b2d14939a42f2a96582abe: Status 404 returned error can't find the container with id 134ee183bc3369eb99defa8381321929cae353dec1b2d14939a42f2a96582abe Apr 17 15:25:27.554271 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:27.554234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6974f5cc54-56jgv" event={"ID":"cde72007-672b-4a83-9ebe-149f70e6e122","Type":"ContainerStarted","Data":"5dc1ba94e258237f60b0ae4f36d19bb2fecf7303f660b15632f4e42c0d1f670f"} Apr 17 15:25:27.554271 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:27.554269 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6974f5cc54-56jgv" event={"ID":"cde72007-672b-4a83-9ebe-149f70e6e122","Type":"ContainerStarted","Data":"134ee183bc3369eb99defa8381321929cae353dec1b2d14939a42f2a96582abe"} Apr 17 15:25:27.574548 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:27.574501 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6974f5cc54-56jgv" podStartSLOduration=1.574484363 podStartE2EDuration="1.574484363s" podCreationTimestamp="2026-04-17 15:25:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:25:27.572881167 +0000 UTC m=+484.029063287" 
watchObservedRunningTime="2026-04-17 15:25:27.574484363 +0000 UTC m=+484.030666459" Apr 17 15:25:29.781959 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.781923 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq"] Apr 17 15:25:29.785426 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.785408 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:29.788140 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.788124 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6gthh\"" Apr 17 15:25:29.797628 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.797605 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq"] Apr 17 15:25:29.882134 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.882102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/39a60d35-8aeb-4b49-a3af-3650913659c0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" (UID: \"39a60d35-8aeb-4b49-a3af-3650913659c0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:29.882301 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.882182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hlcs\" (UniqueName: \"kubernetes.io/projected/39a60d35-8aeb-4b49-a3af-3650913659c0-kube-api-access-6hlcs\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" (UID: \"39a60d35-8aeb-4b49-a3af-3650913659c0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:29.983415 
ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.983378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hlcs\" (UniqueName: \"kubernetes.io/projected/39a60d35-8aeb-4b49-a3af-3650913659c0-kube-api-access-6hlcs\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" (UID: \"39a60d35-8aeb-4b49-a3af-3650913659c0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:29.983585 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.983428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/39a60d35-8aeb-4b49-a3af-3650913659c0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" (UID: \"39a60d35-8aeb-4b49-a3af-3650913659c0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:29.983792 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.983774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/39a60d35-8aeb-4b49-a3af-3650913659c0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" (UID: \"39a60d35-8aeb-4b49-a3af-3650913659c0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:29.991445 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:29.991418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hlcs\" (UniqueName: \"kubernetes.io/projected/39a60d35-8aeb-4b49-a3af-3650913659c0-kube-api-access-6hlcs\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" (UID: \"39a60d35-8aeb-4b49-a3af-3650913659c0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:30.095441 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:30.095351 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:30.216036 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:30.216013 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq"] Apr 17 15:25:30.219370 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:25:30.219342 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39a60d35_8aeb_4b49_a3af_3650913659c0.slice/crio-766ae7cfa5dd1721199bf20ebdf0298f2e42957063a60733a0d4ebe09510359a WatchSource:0}: Error finding container 766ae7cfa5dd1721199bf20ebdf0298f2e42957063a60733a0d4ebe09510359a: Status 404 returned error can't find the container with id 766ae7cfa5dd1721199bf20ebdf0298f2e42957063a60733a0d4ebe09510359a Apr 17 15:25:30.564725 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:30.564692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" event={"ID":"39a60d35-8aeb-4b49-a3af-3650913659c0","Type":"ContainerStarted","Data":"766ae7cfa5dd1721199bf20ebdf0298f2e42957063a60733a0d4ebe09510359a"} Apr 17 15:25:32.536740 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:32.536705 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-hqwc5" Apr 17 15:25:34.546574 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:34.546542 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" Apr 17 15:25:36.583537 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:36.583500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" 
event={"ID":"39a60d35-8aeb-4b49-a3af-3650913659c0","Type":"ContainerStarted","Data":"4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519"} Apr 17 15:25:36.583906 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:36.583722 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:36.601729 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:36.601686 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" podStartSLOduration=2.270542952 podStartE2EDuration="7.601673585s" podCreationTimestamp="2026-04-17 15:25:29 +0000 UTC" firstStartedPulling="2026-04-17 15:25:30.221669321 +0000 UTC m=+486.677851393" lastFinishedPulling="2026-04-17 15:25:35.552799953 +0000 UTC m=+492.008982026" observedRunningTime="2026-04-17 15:25:36.600328385 +0000 UTC m=+493.056510481" watchObservedRunningTime="2026-04-17 15:25:36.601673585 +0000 UTC m=+493.057855671" Apr 17 15:25:37.085680 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:37.085639 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:37.085680 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:37.085680 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:37.090335 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:37.090313 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:37.590278 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:37.590248 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6974f5cc54-56jgv" Apr 17 15:25:47.588666 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:47.588631 2576 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:49.190684 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.190646 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq"] Apr 17 15:25:49.191177 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.190916 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" containerName="manager" containerID="cri-o://4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519" gracePeriod=2 Apr 17 15:25:49.198497 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.198465 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq"] Apr 17 15:25:49.218329 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.218303 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44"] Apr 17 15:25:49.218585 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.218573 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" containerName="manager" Apr 17 15:25:49.218627 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.218587 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" containerName="manager" Apr 17 15:25:49.218673 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.218664 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" containerName="manager" Apr 17 15:25:49.221644 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.221626 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:25:49.224012 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.223988 2576 status_manager.go:895] "Failed to get status for pod" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.235067 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.235025 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44"] Apr 17 15:25:49.246958 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.242446 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc"] Apr 17 15:25:49.246958 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.242804 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" containerName="manager" containerID="cri-o://84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d" gracePeriod=2 Apr 17 15:25:49.250732 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.250698 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc"] Apr 17 15:25:49.256163 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.256126 2576 status_manager.go:895] "Failed to get status for pod" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" err="pods 
\"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.258339 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.258309 2576 status_manager.go:895] "Failed to get status for pod" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" err="pods \"limitador-operator-controller-manager-85c4996f8c-rztlc\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.274827 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.274793 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6"] Apr 17 15:25:49.275165 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.275150 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" containerName="manager" Apr 17 15:25:49.275215 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.275167 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" containerName="manager" Apr 17 15:25:49.275249 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.275221 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" containerName="manager" Apr 17 15:25:49.278177 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.278153 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" Apr 17 15:25:49.280534 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.280501 2576 status_manager.go:895] "Failed to get status for pod" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.282701 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.282667 2576 status_manager.go:895] "Failed to get status for pod" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" err="pods \"limitador-operator-controller-manager-85c4996f8c-rztlc\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.288224 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.288197 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6"] Apr 17 15:25:49.351012 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.350975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thp44\" (UID: \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:25:49.351180 ip-10-0-133-75 kubenswrapper[2576]: 
I0417 15:25:49.351033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prxwk\" (UniqueName: \"kubernetes.io/projected/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-kube-api-access-prxwk\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thp44\" (UID: \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:25:49.351232 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.351174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxsgh\" (UniqueName: \"kubernetes.io/projected/277c2299-36eb-43a7-a599-3a6d75557b5f-kube-api-access-dxsgh\") pod \"limitador-operator-controller-manager-85c4996f8c-2dvt6\" (UID: \"277c2299-36eb-43a7-a599-3a6d75557b5f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" Apr 17 15:25:49.452654 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.452573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxsgh\" (UniqueName: \"kubernetes.io/projected/277c2299-36eb-43a7-a599-3a6d75557b5f-kube-api-access-dxsgh\") pod \"limitador-operator-controller-manager-85c4996f8c-2dvt6\" (UID: \"277c2299-36eb-43a7-a599-3a6d75557b5f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" Apr 17 15:25:49.452818 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.452654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thp44\" (UID: \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:25:49.452818 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.452709 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prxwk\" (UniqueName: \"kubernetes.io/projected/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-kube-api-access-prxwk\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thp44\" (UID: \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:25:49.453110 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.453086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thp44\" (UID: \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:25:49.469146 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.469122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxsgh\" (UniqueName: \"kubernetes.io/projected/277c2299-36eb-43a7-a599-3a6d75557b5f-kube-api-access-dxsgh\") pod \"limitador-operator-controller-manager-85c4996f8c-2dvt6\" (UID: \"277c2299-36eb-43a7-a599-3a6d75557b5f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" Apr 17 15:25:49.469578 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.469561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prxwk\" (UniqueName: \"kubernetes.io/projected/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-kube-api-access-prxwk\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-thp44\" (UID: \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:25:49.480671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.480650 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:49.483174 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.483149 2576 status_manager.go:895] "Failed to get status for pod" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" err="pods \"limitador-operator-controller-manager-85c4996f8c-rztlc\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.483595 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.483580 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" Apr 17 15:25:49.485711 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.485689 2576 status_manager.go:895] "Failed to get status for pod" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.488009 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.487990 2576 status_manager.go:895] "Failed to get status for pod" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 
'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.490243 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.490225 2576 status_manager.go:895] "Failed to get status for pod" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" err="pods \"limitador-operator-controller-manager-85c4996f8c-rztlc\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.554006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.553969 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/39a60d35-8aeb-4b49-a3af-3650913659c0-extensions-socket-volume\") pod \"39a60d35-8aeb-4b49-a3af-3650913659c0\" (UID: \"39a60d35-8aeb-4b49-a3af-3650913659c0\") " Apr 17 15:25:49.554169 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.554103 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hlcs\" (UniqueName: \"kubernetes.io/projected/39a60d35-8aeb-4b49-a3af-3650913659c0-kube-api-access-6hlcs\") pod \"39a60d35-8aeb-4b49-a3af-3650913659c0\" (UID: \"39a60d35-8aeb-4b49-a3af-3650913659c0\") " Apr 17 15:25:49.554462 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.554436 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a60d35-8aeb-4b49-a3af-3650913659c0-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "39a60d35-8aeb-4b49-a3af-3650913659c0" (UID: "39a60d35-8aeb-4b49-a3af-3650913659c0"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:25:49.556091 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.556065 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a60d35-8aeb-4b49-a3af-3650913659c0-kube-api-access-6hlcs" (OuterVolumeSpecName: "kube-api-access-6hlcs") pod "39a60d35-8aeb-4b49-a3af-3650913659c0" (UID: "39a60d35-8aeb-4b49-a3af-3650913659c0"). InnerVolumeSpecName "kube-api-access-6hlcs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:25:49.618440 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.618401 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:25:49.624688 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.624660 2576 generic.go:358] "Generic (PLEG): container finished" podID="39a60d35-8aeb-4b49-a3af-3650913659c0" containerID="4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519" exitCode=0 Apr 17 15:25:49.624805 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.624723 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" Apr 17 15:25:49.624805 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.624745 2576 scope.go:117] "RemoveContainer" containerID="4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519" Apr 17 15:25:49.626029 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.626002 2576 generic.go:358] "Generic (PLEG): container finished" podID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" containerID="84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d" exitCode=0 Apr 17 15:25:49.626168 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.626067 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" Apr 17 15:25:49.627228 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.627203 2576 status_manager.go:895] "Failed to get status for pod" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.629315 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.629285 2576 status_manager.go:895] "Failed to get status for pod" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" err="pods \"limitador-operator-controller-manager-85c4996f8c-rztlc\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.629876 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.629856 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" Apr 17 15:25:49.631334 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.631306 2576 status_manager.go:895] "Failed to get status for pod" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.633129 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.633023 2576 status_manager.go:895] "Failed to get status for pod" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" err="pods \"limitador-operator-controller-manager-85c4996f8c-rztlc\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.633343 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.633212 2576 scope.go:117] "RemoveContainer" containerID="4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519" Apr 17 15:25:49.633544 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:25:49.633518 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519\": container with ID starting with 4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519 not found: ID does not exist" containerID="4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519" Apr 17 15:25:49.633672 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.633647 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519"} err="failed to get container status \"4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519\": rpc error: code = NotFound desc = could not find container \"4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519\": container with ID starting with 4819eafb9414a7e75ffb770e5d1d1acfffdf6a25be3fa0a5840e97cd14bcc519 not found: ID does not exist" Apr 17 15:25:49.633744 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.633706 2576 scope.go:117] "RemoveContainer" containerID="84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d" Apr 17 15:25:49.635824 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.635803 2576 status_manager.go:895] "Failed to get status for pod" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" err="pods \"limitador-operator-controller-manager-85c4996f8c-rztlc\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.637704 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.637681 2576 status_manager.go:895] "Failed to get status for pod" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.641324 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.641301 2576 scope.go:117] "RemoveContainer" 
containerID="84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d" Apr 17 15:25:49.641636 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:25:49.641617 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d\": container with ID starting with 84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d not found: ID does not exist" containerID="84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d" Apr 17 15:25:49.641702 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.641645 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d"} err="failed to get container status \"84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d\": rpc error: code = NotFound desc = could not find container \"84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d\": container with ID starting with 84124c044ad7206450bd578054cf1c976c44490911cab34dc65c167a1748d65d not found: ID does not exist" Apr 17 15:25:49.655006 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.654970 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq6c6\" (UniqueName: \"kubernetes.io/projected/68c72ba3-4b4a-40ff-85d9-4c34cc37c052-kube-api-access-pq6c6\") pod \"68c72ba3-4b4a-40ff-85d9-4c34cc37c052\" (UID: \"68c72ba3-4b4a-40ff-85d9-4c34cc37c052\") " Apr 17 15:25:49.655213 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.655177 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hlcs\" (UniqueName: \"kubernetes.io/projected/39a60d35-8aeb-4b49-a3af-3650913659c0-kube-api-access-6hlcs\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:25:49.655213 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.655196 2576 reconciler_common.go:299] 
"Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/39a60d35-8aeb-4b49-a3af-3650913659c0-extensions-socket-volume\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:25:49.658294 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.658258 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68c72ba3-4b4a-40ff-85d9-4c34cc37c052-kube-api-access-pq6c6" (OuterVolumeSpecName: "kube-api-access-pq6c6") pod "68c72ba3-4b4a-40ff-85d9-4c34cc37c052" (UID: "68c72ba3-4b4a-40ff-85d9-4c34cc37c052"). InnerVolumeSpecName "kube-api-access-pq6c6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:25:49.749626 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.749584 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44"] Apr 17 15:25:49.752908 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:25:49.752874 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e7c565d_6deb_46a1_8a0a_b1a610bcfd0e.slice/crio-616c776afb77978b2a1b4bc3c3e9f843ba9a3af42a44837f39b0134db79cda88 WatchSource:0}: Error finding container 616c776afb77978b2a1b4bc3c3e9f843ba9a3af42a44837f39b0134db79cda88: Status 404 returned error can't find the container with id 616c776afb77978b2a1b4bc3c3e9f843ba9a3af42a44837f39b0134db79cda88 Apr 17 15:25:49.757941 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.757920 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pq6c6\" (UniqueName: \"kubernetes.io/projected/68c72ba3-4b4a-40ff-85d9-4c34cc37c052-kube-api-access-pq6c6\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:25:49.770023 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.770002 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6"] Apr 17 15:25:49.773138 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:25:49.773115 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod277c2299_36eb_43a7_a599_3a6d75557b5f.slice/crio-71ba9ea97210d3b00db7af33fb740245c073e4ae05b3dd533a1824883187b333 WatchSource:0}: Error finding container 71ba9ea97210d3b00db7af33fb740245c073e4ae05b3dd533a1824883187b333: Status 404 returned error can't find the container with id 71ba9ea97210d3b00db7af33fb740245c073e4ae05b3dd533a1824883187b333 Apr 17 15:25:49.936711 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.936671 2576 status_manager.go:895] "Failed to get status for pod" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rztlc" err="pods \"limitador-operator-controller-manager-85c4996f8c-rztlc\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:49.938562 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:49.938537 2576 status_manager.go:895] "Failed to get status for pod" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-w68bq" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-w68bq\" is forbidden: User \"system:node:ip-10-0-133-75.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-75.ec2.internal' and this object" Apr 17 15:25:50.139765 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.139685 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a60d35-8aeb-4b49-a3af-3650913659c0" 
path="/var/lib/kubelet/pods/39a60d35-8aeb-4b49-a3af-3650913659c0/volumes" Apr 17 15:25:50.139992 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.139979 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68c72ba3-4b4a-40ff-85d9-4c34cc37c052" path="/var/lib/kubelet/pods/68c72ba3-4b4a-40ff-85d9-4c34cc37c052/volumes" Apr 17 15:25:50.632560 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.632500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" event={"ID":"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e","Type":"ContainerStarted","Data":"ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9"} Apr 17 15:25:50.632560 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.632550 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" event={"ID":"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e","Type":"ContainerStarted","Data":"616c776afb77978b2a1b4bc3c3e9f843ba9a3af42a44837f39b0134db79cda88"} Apr 17 15:25:50.633085 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.632597 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:25:50.634637 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.634615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" event={"ID":"277c2299-36eb-43a7-a599-3a6d75557b5f","Type":"ContainerStarted","Data":"2d62f867d744d0caf14e8fd67d3f201215061e16f035edef557928eb0424a3da"} Apr 17 15:25:50.634742 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.634640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" 
event={"ID":"277c2299-36eb-43a7-a599-3a6d75557b5f","Type":"ContainerStarted","Data":"71ba9ea97210d3b00db7af33fb740245c073e4ae05b3dd533a1824883187b333"} Apr 17 15:25:50.634780 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.634763 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" Apr 17 15:25:50.654267 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.654217 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" podStartSLOduration=1.654200079 podStartE2EDuration="1.654200079s" podCreationTimestamp="2026-04-17 15:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:25:50.65190514 +0000 UTC m=+507.108087234" watchObservedRunningTime="2026-04-17 15:25:50.654200079 +0000 UTC m=+507.110382173" Apr 17 15:25:50.682499 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:25:50.682439 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" podStartSLOduration=1.682417767 podStartE2EDuration="1.682417767s" podCreationTimestamp="2026-04-17 15:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:25:50.680311489 +0000 UTC m=+507.136493584" watchObservedRunningTime="2026-04-17 15:25:50.682417767 +0000 UTC m=+507.138599865" Apr 17 15:26:01.640681 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:01.640648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2dvt6" Apr 17 15:26:01.641128 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:01.640937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:26:05.022673 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.022632 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44"] Apr 17 15:26:05.023297 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.022932 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" podUID="8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e" containerName="manager" containerID="cri-o://ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9" gracePeriod=10 Apr 17 15:26:05.267868 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.267829 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:26:05.277648 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.277577 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prxwk\" (UniqueName: \"kubernetes.io/projected/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-kube-api-access-prxwk\") pod \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\" (UID: \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\") " Apr 17 15:26:05.277648 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.277618 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-extensions-socket-volume\") pod \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\" (UID: \"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e\") " Apr 17 15:26:05.278031 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.278000 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-extensions-socket-volume" (OuterVolumeSpecName: 
"extensions-socket-volume") pod "8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e" (UID: "8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 15:26:05.279584 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.279567 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-kube-api-access-prxwk" (OuterVolumeSpecName: "kube-api-access-prxwk") pod "8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e" (UID: "8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e"). InnerVolumeSpecName "kube-api-access-prxwk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:26:05.378378 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.378333 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-prxwk\" (UniqueName: \"kubernetes.io/projected/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-kube-api-access-prxwk\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:26:05.378378 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.378370 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e-extensions-socket-volume\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:26:05.682554 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.682467 2576 generic.go:358] "Generic (PLEG): container finished" podID="8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e" containerID="ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9" exitCode=0 Apr 17 15:26:05.682554 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.682507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" event={"ID":"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e","Type":"ContainerDied","Data":"ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9"} Apr 17 
15:26:05.682554 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.682526 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" Apr 17 15:26:05.682554 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.682542 2576 scope.go:117] "RemoveContainer" containerID="ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9" Apr 17 15:26:05.682830 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.682532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44" event={"ID":"8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e","Type":"ContainerDied","Data":"616c776afb77978b2a1b4bc3c3e9f843ba9a3af42a44837f39b0134db79cda88"} Apr 17 15:26:05.690624 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.690604 2576 scope.go:117] "RemoveContainer" containerID="ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9" Apr 17 15:26:05.690900 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:26:05.690871 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9\": container with ID starting with ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9 not found: ID does not exist" containerID="ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9" Apr 17 15:26:05.690949 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.690904 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9"} err="failed to get container status \"ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9\": rpc error: code = NotFound desc = could not find container \"ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9\": container with ID starting with 
ca136616aea1363dba591f7583c4ebdb2d1c09eac3ff21d7c4851b4113659cb9 not found: ID does not exist" Apr 17 15:26:05.703914 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.703888 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44"] Apr 17 15:26:05.707474 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:05.707451 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-thp44"] Apr 17 15:26:06.139305 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:06.139270 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e" path="/var/lib/kubelet/pods/8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e/volumes" Apr 17 15:26:21.235736 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.235698 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"] Apr 17 15:26:21.236276 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.236004 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e" containerName="manager" Apr 17 15:26:21.236276 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.236019 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e" containerName="manager" Apr 17 15:26:21.236276 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.236103 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e7c565d-6deb-46a1-8a0a-b1a610bcfd0e" containerName="manager" Apr 17 15:26:21.239276 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.239254 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.241768 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.241747 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-k7dbr\"" Apr 17 15:26:21.257082 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.257031 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"] Apr 17 15:26:21.291061 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.291008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.291232 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.291067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjq9n\" (UniqueName: \"kubernetes.io/projected/e348ae6f-490b-44bf-9b17-4dbd86c14e76-kube-api-access-xjq9n\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.291232 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.291099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.291232 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.291130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.291232 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.291218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.291426 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.291262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.291426 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.291286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.291426 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.291386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.291541 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.291420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.392437 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 15:26:21.392437 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" Apr 17 
15:26:21.392681 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.392681 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.392681 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.392681 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjq9n\" (UniqueName: \"kubernetes.io/projected/e348ae6f-490b-44bf-9b17-4dbd86c14e76-kube-api-access-xjq9n\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.392681 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.392927 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.392927 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.392927 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.393112 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.392947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.393112 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.393005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.393297 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.393268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.393367 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.393348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.395022 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.394992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.395229 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.395211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.401509 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.401489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e348ae6f-490b-44bf-9b17-4dbd86c14e76-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.401660 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.401637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjq9n\" (UniqueName: \"kubernetes.io/projected/e348ae6f-490b-44bf-9b17-4dbd86c14e76-kube-api-access-xjq9n\") pod \"maas-default-gateway-openshift-default-845c6b4b48-gl45z\" (UID: \"e348ae6f-490b-44bf-9b17-4dbd86c14e76\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.550508 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.550419 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:21.677602 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.677571 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"]
Apr 17 15:26:21.680458 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:26:21.680427 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode348ae6f_490b_44bf_9b17_4dbd86c14e76.slice/crio-826d4dddec3f4a5b714fab04f625698be96cf00de5159331f45e36e8148b6312 WatchSource:0}: Error finding container 826d4dddec3f4a5b714fab04f625698be96cf00de5159331f45e36e8148b6312: Status 404 returned error can't find the container with id 826d4dddec3f4a5b714fab04f625698be96cf00de5159331f45e36e8148b6312
Apr 17 15:26:21.682899 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.682871 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 15:26:21.682991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.682936 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 15:26:21.682991 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.682963 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 17 15:26:21.734587 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:21.734558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" event={"ID":"e348ae6f-490b-44bf-9b17-4dbd86c14e76","Type":"ContainerStarted","Data":"826d4dddec3f4a5b714fab04f625698be96cf00de5159331f45e36e8148b6312"}
Apr 17 15:26:22.739441 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:22.739401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" event={"ID":"e348ae6f-490b-44bf-9b17-4dbd86c14e76","Type":"ContainerStarted","Data":"064479a31d5632c1c6ba9431096808bea712612f45de891efa35fe081d0dfb84"}
Apr 17 15:26:22.759882 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:22.759830 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z" podStartSLOduration=1.759816061 podStartE2EDuration="1.759816061s" podCreationTimestamp="2026-04-17 15:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:26:22.758790668 +0000 UTC m=+539.214972796" watchObservedRunningTime="2026-04-17 15:26:22.759816061 +0000 UTC m=+539.215998156"
Apr 17 15:26:23.551593 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:23.551554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:23.556292 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:23.556261 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:23.743749 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:23.743713 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:23.744668 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:23.744650 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-gl45z"
Apr 17 15:26:41.173884 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.173840 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"]
Apr 17 15:26:41.176910 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.176888 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"]
Apr 17 15:26:41.177040 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.176985 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:26:41.179458 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.179432 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-7w66g\""
Apr 17 15:26:41.179603 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.179464 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 15:26:41.185968 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.185939 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"]
Apr 17 15:26:41.261072 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.261007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxmd4\" (UniqueName: \"kubernetes.io/projected/647fd0ad-eb25-49e4-9487-4990fe72988d-kube-api-access-nxmd4\") pod \"limitador-limitador-78c99df468-6z98m\" (UID: \"647fd0ad-eb25-49e4-9487-4990fe72988d\") " pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:26:41.261253 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.261168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/647fd0ad-eb25-49e4-9487-4990fe72988d-config-file\") pod \"limitador-limitador-78c99df468-6z98m\" (UID: \"647fd0ad-eb25-49e4-9487-4990fe72988d\") " pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:26:41.361933 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.361891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxmd4\" (UniqueName: \"kubernetes.io/projected/647fd0ad-eb25-49e4-9487-4990fe72988d-kube-api-access-nxmd4\") pod \"limitador-limitador-78c99df468-6z98m\" (UID: \"647fd0ad-eb25-49e4-9487-4990fe72988d\") " pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:26:41.362160 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.361974 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/647fd0ad-eb25-49e4-9487-4990fe72988d-config-file\") pod \"limitador-limitador-78c99df468-6z98m\" (UID: \"647fd0ad-eb25-49e4-9487-4990fe72988d\") " pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:26:41.362601 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.362574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/647fd0ad-eb25-49e4-9487-4990fe72988d-config-file\") pod \"limitador-limitador-78c99df468-6z98m\" (UID: \"647fd0ad-eb25-49e4-9487-4990fe72988d\") " pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:26:41.370146 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.370123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxmd4\" (UniqueName: \"kubernetes.io/projected/647fd0ad-eb25-49e4-9487-4990fe72988d-kube-api-access-nxmd4\") pod \"limitador-limitador-78c99df468-6z98m\" (UID: \"647fd0ad-eb25-49e4-9487-4990fe72988d\") " pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:26:41.487482 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.487441 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:26:41.545101 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.544690 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-n5l2k"]
Apr 17 15:26:41.549625 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.549479 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k"
Apr 17 15:26:41.554924 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.554897 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-h29t7\""
Apr 17 15:26:41.556115 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.556033 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-n5l2k"]
Apr 17 15:26:41.622768 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.622666 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"]
Apr 17 15:26:41.625519 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:26:41.625480 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647fd0ad_eb25_49e4_9487_4990fe72988d.slice/crio-c0e20a5080cedc7fac72687cff17d0003bc781ecafb4bf522321885198e80f04 WatchSource:0}: Error finding container c0e20a5080cedc7fac72687cff17d0003bc781ecafb4bf522321885198e80f04: Status 404 returned error can't find the container with id c0e20a5080cedc7fac72687cff17d0003bc781ecafb4bf522321885198e80f04
Apr 17 15:26:41.666394 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.666362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zsxj\" (UniqueName: \"kubernetes.io/projected/26d76022-7599-49fd-8aeb-598d6bdb587f-kube-api-access-6zsxj\") pod \"authorino-f99f4b5cd-n5l2k\" (UID: \"26d76022-7599-49fd-8aeb-598d6bdb587f\") " pod="kuadrant-system/authorino-f99f4b5cd-n5l2k"
Apr 17 15:26:41.679340 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.679288 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-dct4n"]
Apr 17 15:26:41.682534 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.682518 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dct4n"
Apr 17 15:26:41.687615 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.687587 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-dct4n"]
Apr 17 15:26:41.767304 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.767217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zsxj\" (UniqueName: \"kubernetes.io/projected/26d76022-7599-49fd-8aeb-598d6bdb587f-kube-api-access-6zsxj\") pod \"authorino-f99f4b5cd-n5l2k\" (UID: \"26d76022-7599-49fd-8aeb-598d6bdb587f\") " pod="kuadrant-system/authorino-f99f4b5cd-n5l2k"
Apr 17 15:26:41.767471 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.767306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpbtx\" (UniqueName: \"kubernetes.io/projected/30e3a153-6298-4690-950b-267d89842cc9-kube-api-access-qpbtx\") pod \"authorino-7498df8756-dct4n\" (UID: \"30e3a153-6298-4690-950b-267d89842cc9\") " pod="kuadrant-system/authorino-7498df8756-dct4n"
Apr 17 15:26:41.774891 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.774867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zsxj\" (UniqueName: \"kubernetes.io/projected/26d76022-7599-49fd-8aeb-598d6bdb587f-kube-api-access-6zsxj\") pod \"authorino-f99f4b5cd-n5l2k\" (UID: \"26d76022-7599-49fd-8aeb-598d6bdb587f\") " pod="kuadrant-system/authorino-f99f4b5cd-n5l2k"
Apr 17 15:26:41.806324 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.806290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-6z98m" event={"ID":"647fd0ad-eb25-49e4-9487-4990fe72988d","Type":"ContainerStarted","Data":"c0e20a5080cedc7fac72687cff17d0003bc781ecafb4bf522321885198e80f04"}
Apr 17 15:26:41.862778 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.862720 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k"
Apr 17 15:26:41.867738 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.867716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpbtx\" (UniqueName: \"kubernetes.io/projected/30e3a153-6298-4690-950b-267d89842cc9-kube-api-access-qpbtx\") pod \"authorino-7498df8756-dct4n\" (UID: \"30e3a153-6298-4690-950b-267d89842cc9\") " pod="kuadrant-system/authorino-7498df8756-dct4n"
Apr 17 15:26:41.875704 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.875676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpbtx\" (UniqueName: \"kubernetes.io/projected/30e3a153-6298-4690-950b-267d89842cc9-kube-api-access-qpbtx\") pod \"authorino-7498df8756-dct4n\" (UID: \"30e3a153-6298-4690-950b-267d89842cc9\") " pod="kuadrant-system/authorino-7498df8756-dct4n"
Apr 17 15:26:41.993770 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:41.993728 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dct4n"
Apr 17 15:26:42.011042 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:42.011011 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-n5l2k"]
Apr 17 15:26:42.015177 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:26:42.015133 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26d76022_7599_49fd_8aeb_598d6bdb587f.slice/crio-5032746fbd8e2e480097d1a265066abd0150fc791d6cc5326336b58aecc712fd WatchSource:0}: Error finding container 5032746fbd8e2e480097d1a265066abd0150fc791d6cc5326336b58aecc712fd: Status 404 returned error can't find the container with id 5032746fbd8e2e480097d1a265066abd0150fc791d6cc5326336b58aecc712fd
Apr 17 15:26:42.153204 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:42.153170 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-dct4n"]
Apr 17 15:26:42.158024 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:26:42.157992 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e3a153_6298_4690_950b_267d89842cc9.slice/crio-b68abdad5af47f9314165b4a42ad1361c7a3333a5f31d2aecf7d204d50f0c68a WatchSource:0}: Error finding container b68abdad5af47f9314165b4a42ad1361c7a3333a5f31d2aecf7d204d50f0c68a: Status 404 returned error can't find the container with id b68abdad5af47f9314165b4a42ad1361c7a3333a5f31d2aecf7d204d50f0c68a
Apr 17 15:26:42.813622 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:42.813582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k" event={"ID":"26d76022-7599-49fd-8aeb-598d6bdb587f","Type":"ContainerStarted","Data":"5032746fbd8e2e480097d1a265066abd0150fc791d6cc5326336b58aecc712fd"}
Apr 17 15:26:42.816084 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:42.816032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dct4n" event={"ID":"30e3a153-6298-4690-950b-267d89842cc9","Type":"ContainerStarted","Data":"b68abdad5af47f9314165b4a42ad1361c7a3333a5f31d2aecf7d204d50f0c68a"}
Apr 17 15:26:45.827384 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:45.827340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k" event={"ID":"26d76022-7599-49fd-8aeb-598d6bdb587f","Type":"ContainerStarted","Data":"7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c"}
Apr 17 15:26:45.828729 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:45.828704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-6z98m" event={"ID":"647fd0ad-eb25-49e4-9487-4990fe72988d","Type":"ContainerStarted","Data":"ff6a110343e29633b247f591e2afb040a54a716004514db8e62121927f80935a"}
Apr 17 15:26:45.828829 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:45.828768 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:26:45.830028 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:45.830006 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dct4n" event={"ID":"30e3a153-6298-4690-950b-267d89842cc9","Type":"ContainerStarted","Data":"f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa"}
Apr 17 15:26:45.844093 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:45.844017 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k" podStartSLOduration=1.517145996 podStartE2EDuration="4.844003173s" podCreationTimestamp="2026-04-17 15:26:41 +0000 UTC" firstStartedPulling="2026-04-17 15:26:42.016681923 +0000 UTC m=+558.472864010" lastFinishedPulling="2026-04-17 15:26:45.343539107 +0000 UTC m=+561.799721187" observedRunningTime="2026-04-17 15:26:45.842928716 +0000 UTC m=+562.299110812" watchObservedRunningTime="2026-04-17 15:26:45.844003173 +0000 UTC m=+562.300185268"
Apr 17 15:26:45.867582 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:45.867530 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-dct4n" podStartSLOduration=1.683859081 podStartE2EDuration="4.86751558s" podCreationTimestamp="2026-04-17 15:26:41 +0000 UTC" firstStartedPulling="2026-04-17 15:26:42.159685606 +0000 UTC m=+558.615867688" lastFinishedPulling="2026-04-17 15:26:45.343342113 +0000 UTC m=+561.799524187" observedRunningTime="2026-04-17 15:26:45.86622382 +0000 UTC m=+562.322405918" watchObservedRunningTime="2026-04-17 15:26:45.86751558 +0000 UTC m=+562.323697675"
Apr 17 15:26:45.885518 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:45.885455 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-6z98m" podStartSLOduration=1.115792521 podStartE2EDuration="4.885437657s" podCreationTimestamp="2026-04-17 15:26:41 +0000 UTC" firstStartedPulling="2026-04-17 15:26:41.627670077 +0000 UTC m=+558.083852166" lastFinishedPulling="2026-04-17 15:26:45.397315211 +0000 UTC m=+561.853497302" observedRunningTime="2026-04-17 15:26:45.884668182 +0000 UTC m=+562.340850280" watchObservedRunningTime="2026-04-17 15:26:45.885437657 +0000 UTC m=+562.341619753"
Apr 17 15:26:45.897320 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:45.897291 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-n5l2k"]
Apr 17 15:26:47.837177 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:47.837116 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k" podUID="26d76022-7599-49fd-8aeb-598d6bdb587f" containerName="authorino" containerID="cri-o://7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c" gracePeriod=30
Apr 17 15:26:48.076727 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.076701 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k"
Apr 17 15:26:48.225164 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.225130 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zsxj\" (UniqueName: \"kubernetes.io/projected/26d76022-7599-49fd-8aeb-598d6bdb587f-kube-api-access-6zsxj\") pod \"26d76022-7599-49fd-8aeb-598d6bdb587f\" (UID: \"26d76022-7599-49fd-8aeb-598d6bdb587f\") "
Apr 17 15:26:48.227216 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.227184 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d76022-7599-49fd-8aeb-598d6bdb587f-kube-api-access-6zsxj" (OuterVolumeSpecName: "kube-api-access-6zsxj") pod "26d76022-7599-49fd-8aeb-598d6bdb587f" (UID: "26d76022-7599-49fd-8aeb-598d6bdb587f"). InnerVolumeSpecName "kube-api-access-6zsxj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 15:26:48.326374 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.326336 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6zsxj\" (UniqueName: \"kubernetes.io/projected/26d76022-7599-49fd-8aeb-598d6bdb587f-kube-api-access-6zsxj\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 17 15:26:48.841982 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.841943 2576 generic.go:358] "Generic (PLEG): container finished" podID="26d76022-7599-49fd-8aeb-598d6bdb587f" containerID="7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c" exitCode=0
Apr 17 15:26:48.842446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.842002 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k"
Apr 17 15:26:48.842446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.842028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k" event={"ID":"26d76022-7599-49fd-8aeb-598d6bdb587f","Type":"ContainerDied","Data":"7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c"}
Apr 17 15:26:48.842446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.842079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-n5l2k" event={"ID":"26d76022-7599-49fd-8aeb-598d6bdb587f","Type":"ContainerDied","Data":"5032746fbd8e2e480097d1a265066abd0150fc791d6cc5326336b58aecc712fd"}
Apr 17 15:26:48.842446 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.842096 2576 scope.go:117] "RemoveContainer" containerID="7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c"
Apr 17 15:26:48.852451 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.852433 2576 scope.go:117] "RemoveContainer" containerID="7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c"
Apr 17 15:26:48.852719 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:26:48.852697 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c\": container with ID starting with 7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c not found: ID does not exist" containerID="7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c"
Apr 17 15:26:48.852762 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.852730 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c"} err="failed to get container status \"7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c\": rpc error: code = NotFound desc = could not find container \"7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c\": container with ID starting with 7bbb669412096ff91759ba6a7c231b255717e140dc8fe23b57e824361e91d45c not found: ID does not exist"
Apr 17 15:26:48.863788 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.863765 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-n5l2k"]
Apr 17 15:26:48.866922 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:48.866901 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-n5l2k"]
Apr 17 15:26:50.139984 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:50.139954 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d76022-7599-49fd-8aeb-598d6bdb587f" path="/var/lib/kubelet/pods/26d76022-7599-49fd-8aeb-598d6bdb587f/volumes"
Apr 17 15:26:56.834817 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:26:56.834789 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-6z98m"
Apr 17 15:27:17.516172 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.516139 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"]
Apr 17 15:27:17.806110 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.806001 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ffjbn"]
Apr 17 15:27:17.806419 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.806400 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d76022-7599-49fd-8aeb-598d6bdb587f" containerName="authorino"
Apr 17 15:27:17.806497 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.806422 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d76022-7599-49fd-8aeb-598d6bdb587f" containerName="authorino"
Apr 17 15:27:17.806550 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.806533 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="26d76022-7599-49fd-8aeb-598d6bdb587f" containerName="authorino"
Apr 17 15:27:17.809629 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.809606 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ffjbn"
Apr 17 15:27:17.814822 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.814795 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ffjbn"]
Apr 17 15:27:17.889194 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.889162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdd4k\" (UniqueName: \"kubernetes.io/projected/73bde0f5-7d02-466c-9c39-04c76d36d3b9-kube-api-access-hdd4k\") pod \"authorino-8b475cf9f-ffjbn\" (UID: \"73bde0f5-7d02-466c-9c39-04c76d36d3b9\") " pod="kuadrant-system/authorino-8b475cf9f-ffjbn"
Apr 17 15:27:17.990312 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.990268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdd4k\" (UniqueName: \"kubernetes.io/projected/73bde0f5-7d02-466c-9c39-04c76d36d3b9-kube-api-access-hdd4k\") pod \"authorino-8b475cf9f-ffjbn\" (UID: \"73bde0f5-7d02-466c-9c39-04c76d36d3b9\") " pod="kuadrant-system/authorino-8b475cf9f-ffjbn"
Apr 17 15:27:17.998012 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:17.997986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdd4k\" (UniqueName: \"kubernetes.io/projected/73bde0f5-7d02-466c-9c39-04c76d36d3b9-kube-api-access-hdd4k\") pod \"authorino-8b475cf9f-ffjbn\" (UID: \"73bde0f5-7d02-466c-9c39-04c76d36d3b9\") " pod="kuadrant-system/authorino-8b475cf9f-ffjbn"
Apr 17 15:27:18.036268 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.036234 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ffjbn"]
Apr 17 15:27:18.036520 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.036503 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ffjbn"
Apr 17 15:27:18.071120 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.066703 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-69f6bfc4b7-vjt66"]
Apr 17 15:27:18.072627 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.072597 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-69f6bfc4b7-vjt66"
Apr 17 15:27:18.078593 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.078567 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-69f6bfc4b7-vjt66"]
Apr 17 15:27:18.191507 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.191471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsqs8\" (UniqueName: \"kubernetes.io/projected/0ec78324-073a-4e55-ad80-6ff4db0909bc-kube-api-access-lsqs8\") pod \"authorino-69f6bfc4b7-vjt66\" (UID: \"0ec78324-073a-4e55-ad80-6ff4db0909bc\") " pod="kuadrant-system/authorino-69f6bfc4b7-vjt66"
Apr 17 15:27:18.194461 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.194433 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ffjbn"]
Apr 17 15:27:18.197642 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:27:18.197616 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73bde0f5_7d02_466c_9c39_04c76d36d3b9.slice/crio-eefc3aad065d12c74c66fc01f3eaa141ee997d33944003ca9135aacfea45f72e WatchSource:0}: Error finding container eefc3aad065d12c74c66fc01f3eaa141ee997d33944003ca9135aacfea45f72e: Status 404 returned error can't find the container with id eefc3aad065d12c74c66fc01f3eaa141ee997d33944003ca9135aacfea45f72e
Apr 17 15:27:18.251379 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.251304 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-69f6bfc4b7-vjt66"]
Apr 17 15:27:18.251621 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:27:18.251596 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-lsqs8], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-69f6bfc4b7-vjt66" podUID="0ec78324-073a-4e55-ad80-6ff4db0909bc"
Apr 17 15:27:18.276664 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.276631 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-64b8458f55-2htbx"]
Apr 17 15:27:18.279901 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.279884 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-64b8458f55-2htbx"
Apr 17 15:27:18.282386 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.282365 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 17 15:27:18.290335 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.289924 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-64b8458f55-2htbx"]
Apr 17 15:27:18.292466 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.292327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsqs8\" (UniqueName: \"kubernetes.io/projected/0ec78324-073a-4e55-ad80-6ff4db0909bc-kube-api-access-lsqs8\") pod \"authorino-69f6bfc4b7-vjt66\" (UID: \"0ec78324-073a-4e55-ad80-6ff4db0909bc\") " pod="kuadrant-system/authorino-69f6bfc4b7-vjt66"
Apr 17 15:27:18.300953 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.300926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsqs8\" (UniqueName: \"kubernetes.io/projected/0ec78324-073a-4e55-ad80-6ff4db0909bc-kube-api-access-lsqs8\") pod \"authorino-69f6bfc4b7-vjt66\" (UID: \"0ec78324-073a-4e55-ad80-6ff4db0909bc\") " pod="kuadrant-system/authorino-69f6bfc4b7-vjt66"
Apr 17 15:27:18.393920 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.393808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lhv9\" (UniqueName: \"kubernetes.io/projected/ca6d739a-a898-45a9-93ef-15d4690df060-kube-api-access-5lhv9\") pod \"authorino-64b8458f55-2htbx\" (UID: \"ca6d739a-a898-45a9-93ef-15d4690df060\") " pod="kuadrant-system/authorino-64b8458f55-2htbx"
Apr 17 15:27:18.393920 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.393878 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ca6d739a-a898-45a9-93ef-15d4690df060-tls-cert\") pod \"authorino-64b8458f55-2htbx\" (UID: \"ca6d739a-a898-45a9-93ef-15d4690df060\") " pod="kuadrant-system/authorino-64b8458f55-2htbx"
Apr 17 15:27:18.494671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.494618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5lhv9\" (UniqueName: \"kubernetes.io/projected/ca6d739a-a898-45a9-93ef-15d4690df060-kube-api-access-5lhv9\") pod \"authorino-64b8458f55-2htbx\" (UID: \"ca6d739a-a898-45a9-93ef-15d4690df060\") " pod="kuadrant-system/authorino-64b8458f55-2htbx"
Apr 17 15:27:18.494671 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.494677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ca6d739a-a898-45a9-93ef-15d4690df060-tls-cert\") pod \"authorino-64b8458f55-2htbx\" (UID: \"ca6d739a-a898-45a9-93ef-15d4690df060\") " pod="kuadrant-system/authorino-64b8458f55-2htbx"
Apr 17 15:27:18.497106 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.497077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName:
\"kubernetes.io/secret/ca6d739a-a898-45a9-93ef-15d4690df060-tls-cert\") pod \"authorino-64b8458f55-2htbx\" (UID: \"ca6d739a-a898-45a9-93ef-15d4690df060\") " pod="kuadrant-system/authorino-64b8458f55-2htbx" Apr 17 15:27:18.501726 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.501703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lhv9\" (UniqueName: \"kubernetes.io/projected/ca6d739a-a898-45a9-93ef-15d4690df060-kube-api-access-5lhv9\") pod \"authorino-64b8458f55-2htbx\" (UID: \"ca6d739a-a898-45a9-93ef-15d4690df060\") " pod="kuadrant-system/authorino-64b8458f55-2htbx" Apr 17 15:27:18.594895 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.594872 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-64b8458f55-2htbx" Apr 17 15:27:18.766033 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.766006 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-64b8458f55-2htbx"] Apr 17 15:27:18.768570 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:27:18.768528 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca6d739a_a898_45a9_93ef_15d4690df060.slice/crio-a17fe7a177c244b5e99a12daf87c7a528fa8b3e2c286db462a2c62b7f745b971 WatchSource:0}: Error finding container a17fe7a177c244b5e99a12daf87c7a528fa8b3e2c286db462a2c62b7f745b971: Status 404 returned error can't find the container with id a17fe7a177c244b5e99a12daf87c7a528fa8b3e2c286db462a2c62b7f745b971 Apr 17 15:27:18.943536 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.943429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-64b8458f55-2htbx" event={"ID":"ca6d739a-a898-45a9-93ef-15d4690df060","Type":"ContainerStarted","Data":"a17fe7a177c244b5e99a12daf87c7a528fa8b3e2c286db462a2c62b7f745b971"} Apr 17 15:27:18.946022 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.945229 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-69f6bfc4b7-vjt66" Apr 17 15:27:18.946022 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.945690 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-ffjbn" podUID="73bde0f5-7d02-466c-9c39-04c76d36d3b9" containerName="authorino" containerID="cri-o://e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6" gracePeriod=30 Apr 17 15:27:18.946022 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.945711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-ffjbn" event={"ID":"73bde0f5-7d02-466c-9c39-04c76d36d3b9","Type":"ContainerStarted","Data":"e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6"} Apr 17 15:27:18.946022 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.945747 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-ffjbn" event={"ID":"73bde0f5-7d02-466c-9c39-04c76d36d3b9","Type":"ContainerStarted","Data":"eefc3aad065d12c74c66fc01f3eaa141ee997d33944003ca9135aacfea45f72e"} Apr 17 15:27:18.951325 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.951299 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-69f6bfc4b7-vjt66" Apr 17 15:27:18.960096 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.960030 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-ffjbn" podStartSLOduration=1.617513975 podStartE2EDuration="1.96001802s" podCreationTimestamp="2026-04-17 15:27:17 +0000 UTC" firstStartedPulling="2026-04-17 15:27:18.198855374 +0000 UTC m=+594.655037447" lastFinishedPulling="2026-04-17 15:27:18.541359402 +0000 UTC m=+594.997541492" observedRunningTime="2026-04-17 15:27:18.959249482 +0000 UTC m=+595.415431773" watchObservedRunningTime="2026-04-17 15:27:18.96001802 +0000 UTC m=+595.416200151" Apr 17 15:27:18.997618 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.997580 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsqs8\" (UniqueName: \"kubernetes.io/projected/0ec78324-073a-4e55-ad80-6ff4db0909bc-kube-api-access-lsqs8\") pod \"0ec78324-073a-4e55-ad80-6ff4db0909bc\" (UID: \"0ec78324-073a-4e55-ad80-6ff4db0909bc\") " Apr 17 15:27:18.999953 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:18.999922 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec78324-073a-4e55-ad80-6ff4db0909bc-kube-api-access-lsqs8" (OuterVolumeSpecName: "kube-api-access-lsqs8") pod "0ec78324-073a-4e55-ad80-6ff4db0909bc" (UID: "0ec78324-073a-4e55-ad80-6ff4db0909bc"). InnerVolumeSpecName "kube-api-access-lsqs8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:27:19.098894 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.098860 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lsqs8\" (UniqueName: \"kubernetes.io/projected/0ec78324-073a-4e55-ad80-6ff4db0909bc-kube-api-access-lsqs8\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:27:19.220378 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.220359 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ffjbn" Apr 17 15:27:19.300240 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.300196 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdd4k\" (UniqueName: \"kubernetes.io/projected/73bde0f5-7d02-466c-9c39-04c76d36d3b9-kube-api-access-hdd4k\") pod \"73bde0f5-7d02-466c-9c39-04c76d36d3b9\" (UID: \"73bde0f5-7d02-466c-9c39-04c76d36d3b9\") " Apr 17 15:27:19.302299 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.302181 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73bde0f5-7d02-466c-9c39-04c76d36d3b9-kube-api-access-hdd4k" (OuterVolumeSpecName: "kube-api-access-hdd4k") pod "73bde0f5-7d02-466c-9c39-04c76d36d3b9" (UID: "73bde0f5-7d02-466c-9c39-04c76d36d3b9"). InnerVolumeSpecName "kube-api-access-hdd4k". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:27:19.401130 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.401098 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hdd4k\" (UniqueName: \"kubernetes.io/projected/73bde0f5-7d02-466c-9c39-04c76d36d3b9-kube-api-access-hdd4k\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:27:19.949979 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.949939 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-64b8458f55-2htbx" event={"ID":"ca6d739a-a898-45a9-93ef-15d4690df060","Type":"ContainerStarted","Data":"fefc87bdf652b257ab6d84139314cef05ebcd6f597ec164d6f44b4006808db72"} Apr 17 15:27:19.951136 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.951104 2576 generic.go:358] "Generic (PLEG): container finished" podID="73bde0f5-7d02-466c-9c39-04c76d36d3b9" containerID="e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6" exitCode=0 Apr 17 15:27:19.951265 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.951147 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-ffjbn" Apr 17 15:27:19.951265 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.951193 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-ffjbn" event={"ID":"73bde0f5-7d02-466c-9c39-04c76d36d3b9","Type":"ContainerDied","Data":"e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6"} Apr 17 15:27:19.951265 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.951225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-ffjbn" event={"ID":"73bde0f5-7d02-466c-9c39-04c76d36d3b9","Type":"ContainerDied","Data":"eefc3aad065d12c74c66fc01f3eaa141ee997d33944003ca9135aacfea45f72e"} Apr 17 15:27:19.951265 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.951244 2576 scope.go:117] "RemoveContainer" containerID="e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6" Apr 17 15:27:19.951265 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.951252 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-69f6bfc4b7-vjt66" Apr 17 15:27:19.959538 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.959519 2576 scope.go:117] "RemoveContainer" containerID="e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6" Apr 17 15:27:19.959826 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:27:19.959784 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6\": container with ID starting with e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6 not found: ID does not exist" containerID="e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6" Apr 17 15:27:19.959897 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.959822 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6"} err="failed to get container status \"e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6\": rpc error: code = NotFound desc = could not find container \"e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6\": container with ID starting with e4416e947be71237c73a2e46534d74fe8eadfd89c33761b95bc3f91ca05013e6 not found: ID does not exist" Apr 17 15:27:19.964345 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.964306 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-64b8458f55-2htbx" podStartSLOduration=1.620793838 podStartE2EDuration="1.964293391s" podCreationTimestamp="2026-04-17 15:27:18 +0000 UTC" firstStartedPulling="2026-04-17 15:27:18.769886002 +0000 UTC m=+595.226068077" lastFinishedPulling="2026-04-17 15:27:19.113385553 +0000 UTC m=+595.569567630" observedRunningTime="2026-04-17 15:27:19.963651127 +0000 UTC m=+596.419833219" watchObservedRunningTime="2026-04-17 15:27:19.964293391 +0000 UTC 
m=+596.420475511" Apr 17 15:27:19.988205 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.988175 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-69f6bfc4b7-vjt66"] Apr 17 15:27:19.993700 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.993672 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-dct4n"] Apr 17 15:27:19.993932 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.993890 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-dct4n" podUID="30e3a153-6298-4690-950b-267d89842cc9" containerName="authorino" containerID="cri-o://f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa" gracePeriod=30 Apr 17 15:27:19.995462 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:19.995439 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-69f6bfc4b7-vjt66"] Apr 17 15:27:20.007300 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.007277 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ffjbn"] Apr 17 15:27:20.010018 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.009989 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-ffjbn"] Apr 17 15:27:20.147923 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.147888 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec78324-073a-4e55-ad80-6ff4db0909bc" path="/var/lib/kubelet/pods/0ec78324-073a-4e55-ad80-6ff4db0909bc/volumes" Apr 17 15:27:20.148307 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.148285 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73bde0f5-7d02-466c-9c39-04c76d36d3b9" path="/var/lib/kubelet/pods/73bde0f5-7d02-466c-9c39-04c76d36d3b9/volumes" Apr 17 15:27:20.257956 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.257931 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dct4n" Apr 17 15:27:20.307565 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.307532 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpbtx\" (UniqueName: \"kubernetes.io/projected/30e3a153-6298-4690-950b-267d89842cc9-kube-api-access-qpbtx\") pod \"30e3a153-6298-4690-950b-267d89842cc9\" (UID: \"30e3a153-6298-4690-950b-267d89842cc9\") " Apr 17 15:27:20.309670 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.309637 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e3a153-6298-4690-950b-267d89842cc9-kube-api-access-qpbtx" (OuterVolumeSpecName: "kube-api-access-qpbtx") pod "30e3a153-6298-4690-950b-267d89842cc9" (UID: "30e3a153-6298-4690-950b-267d89842cc9"). InnerVolumeSpecName "kube-api-access-qpbtx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 15:27:20.408171 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.408135 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qpbtx\" (UniqueName: \"kubernetes.io/projected/30e3a153-6298-4690-950b-267d89842cc9-kube-api-access-qpbtx\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 17 15:27:20.955522 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.955487 2576 generic.go:358] "Generic (PLEG): container finished" podID="30e3a153-6298-4690-950b-267d89842cc9" containerID="f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa" exitCode=0 Apr 17 15:27:20.956009 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.955538 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dct4n" Apr 17 15:27:20.956009 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.955561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dct4n" event={"ID":"30e3a153-6298-4690-950b-267d89842cc9","Type":"ContainerDied","Data":"f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa"} Apr 17 15:27:20.956009 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.955596 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dct4n" event={"ID":"30e3a153-6298-4690-950b-267d89842cc9","Type":"ContainerDied","Data":"b68abdad5af47f9314165b4a42ad1361c7a3333a5f31d2aecf7d204d50f0c68a"} Apr 17 15:27:20.956009 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.955614 2576 scope.go:117] "RemoveContainer" containerID="f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa" Apr 17 15:27:20.963888 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.963868 2576 scope.go:117] "RemoveContainer" containerID="f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa" Apr 17 15:27:20.964179 ip-10-0-133-75 kubenswrapper[2576]: E0417 15:27:20.964157 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa\": container with ID starting with f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa not found: ID does not exist" containerID="f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa" Apr 17 15:27:20.964251 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.964191 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa"} err="failed to get container status \"f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa\": rpc error: code = 
NotFound desc = could not find container \"f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa\": container with ID starting with f6bd2b78fb706d0b8d6e160298a504f34c0f7c30204af73ae48055f288e3e6aa not found: ID does not exist" Apr 17 15:27:20.977682 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.977654 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-dct4n"] Apr 17 15:27:20.981075 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:20.981036 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-dct4n"] Apr 17 15:27:22.145473 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:22.145414 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e3a153-6298-4690-950b-267d89842cc9" path="/var/lib/kubelet/pods/30e3a153-6298-4690-950b-267d89842cc9/volumes" Apr 17 15:27:24.027460 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:24.027434 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:27:24.027834 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:24.027655 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:27:56.415288 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:27:56.415251 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"] Apr 17 15:28:04.890375 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:28:04.890336 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"] Apr 17 15:28:18.869809 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:28:18.869770 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"] Apr 17 15:28:23.591281 
ip-10-0-133-75 kubenswrapper[2576]: I0417 15:28:23.591241 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"] Apr 17 15:28:41.170529 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:28:41.170488 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"] Apr 17 15:28:55.766261 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:28:55.766226 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"] Apr 17 15:29:00.273326 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:29:00.273291 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-6z98m"] Apr 17 15:32:24.049648 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:24.049572 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:32:24.051703 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:24.051675 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log" Apr 17 15:32:49.598396 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:49.598350 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-64b8458f55-2htbx_ca6d739a-a898-45a9-93ef-15d4690df060/authorino/0.log" Apr 17 15:32:54.360398 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:54.360362 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9bd7bdf77-zwx8q_1e209b7e-8e46-481b-8393-68e4d8fdb20e/manager/0.log" Apr 17 15:32:55.805848 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:55.805811 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-64b8458f55-2htbx_ca6d739a-a898-45a9-93ef-15d4690df060/authorino/0.log" Apr 17 15:32:56.030226 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:56.030198 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-hqwc5_f9137b3e-b1b6-4137-a433-6fce03d62db1/manager/0.log" Apr 17 15:32:56.260232 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:56.260204 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-sqmgh_078d1eb0-c79c-4ff5-a0db-61196782bdce/registry-server/0.log" Apr 17 15:32:56.488438 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:56.488411 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-6z98m_647fd0ad-eb25-49e4-9487-4990fe72988d/limitador/0.log" Apr 17 15:32:56.600239 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:56.600157 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2dvt6_277c2299-36eb-43a7-a599-3a6d75557b5f/manager/0.log" Apr 17 15:32:56.951643 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:56.951567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f8m7jp_c2f2fc86-6d29-4540-840d-05d2dba28fae/istio-proxy/0.log" Apr 17 15:32:57.411242 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:32:57.411213 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-gl45z_e348ae6f-490b-44bf-9b17-4dbd86c14e76/istio-proxy/0.log" Apr 17 15:33:05.665370 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:05.665342 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7gp9s_fd2172f8-51a1-4324-8352-f33756fe0535/global-pull-secret-syncer/0.log" Apr 17 15:33:05.788573 
ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:05.788539 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-jgfqk_f4299c8c-3050-4ce9-9766-13f14ff297a7/konnectivity-agent/0.log" Apr 17 15:33:05.913971 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:05.913938 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-75.ec2.internal_5401552a10b9bd31fa1f4a18dcace9bb/haproxy/0.log" Apr 17 15:33:09.997574 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:09.997544 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-64b8458f55-2htbx_ca6d739a-a898-45a9-93ef-15d4690df060/authorino/0.log" Apr 17 15:33:10.046790 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:10.046761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-hqwc5_f9137b3e-b1b6-4137-a433-6fce03d62db1/manager/0.log" Apr 17 15:33:10.105235 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:10.105196 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-sqmgh_078d1eb0-c79c-4ff5-a0db-61196782bdce/registry-server/0.log" Apr 17 15:33:10.181885 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:10.181860 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-6z98m_647fd0ad-eb25-49e4-9487-4990fe72988d/limitador/0.log" Apr 17 15:33:10.213102 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:10.213073 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2dvt6_277c2299-36eb-43a7-a599-3a6d75557b5f/manager/0.log" Apr 17 15:33:11.950034 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:11.949990 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-tw2s4_28a3e5c6-6810-4b5f-8dbf-cac934703031/monitoring-plugin/0.log" Apr 17 15:33:12.074471 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:12.074425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dxplq_40471bd1-f59d-42eb-84db-1b5f51c4f3e8/node-exporter/0.log" Apr 17 15:33:12.094105 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:12.094071 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dxplq_40471bd1-f59d-42eb-84db-1b5f51c4f3e8/kube-rbac-proxy/0.log" Apr 17 15:33:12.115013 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:12.114990 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dxplq_40471bd1-f59d-42eb-84db-1b5f51c4f3e8/init-textfile/0.log" Apr 17 15:33:12.491321 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:12.491289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-w4mf6_3a5e95c4-7657-40ac-9a51-cd8077d947ec/prometheus-operator/0.log" Apr 17 15:33:12.509371 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:12.509330 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-w4mf6_3a5e95c4-7657-40ac-9a51-cd8077d947ec/kube-rbac-proxy/0.log" Apr 17 15:33:14.289068 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.289008 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"] Apr 17 15:33:14.289484 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.289363 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30e3a153-6298-4690-950b-267d89842cc9" containerName="authorino" Apr 17 15:33:14.289484 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.289376 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="30e3a153-6298-4690-950b-267d89842cc9" containerName="authorino"
Apr 17 15:33:14.289484 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.289384 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73bde0f5-7d02-466c-9c39-04c76d36d3b9" containerName="authorino"
Apr 17 15:33:14.289484 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.289389 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="73bde0f5-7d02-466c-9c39-04c76d36d3b9" containerName="authorino"
Apr 17 15:33:14.289484 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.289433 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="30e3a153-6298-4690-950b-267d89842cc9" containerName="authorino"
Apr 17 15:33:14.289484 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.289442 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="73bde0f5-7d02-466c-9c39-04c76d36d3b9" containerName="authorino"
Apr 17 15:33:14.292403 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.292381 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.294836 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.294815 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-79652\"/\"default-dockercfg-zpffh\""
Apr 17 15:33:14.294981 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.294856 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-79652\"/\"openshift-service-ca.crt\""
Apr 17 15:33:14.294981 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.294856 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-79652\"/\"kube-root-ca.crt\""
Apr 17 15:33:14.301659 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.301641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"]
Apr 17 15:33:14.414853 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.414817 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cng7n\" (UniqueName: \"kubernetes.io/projected/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-kube-api-access-cng7n\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.415042 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.414879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-proc\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.415042 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.414975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-lib-modules\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.415042 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.415035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-sys\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.415176 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.415089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-podres\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.515560 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.515526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cng7n\" (UniqueName: \"kubernetes.io/projected/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-kube-api-access-cng7n\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.515752 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.515579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-proc\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.515752 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.515606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-lib-modules\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.515752 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.515637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-sys\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.515752 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.515662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-podres\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.515752 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.515689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-proc\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.515950 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.515756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-sys\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.515950 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.515767 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-podres\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.515950 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.515788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-lib-modules\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.524649 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.524620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cng7n\" (UniqueName: \"kubernetes.io/projected/eb798e2f-a6ed-47dc-accf-825d0b91a2ab-kube-api-access-cng7n\") pod \"perf-node-gather-daemonset-kvc6k\" (UID: \"eb798e2f-a6ed-47dc-accf-825d0b91a2ab\") " pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.604555 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.604474 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:14.724411 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.724382 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"]
Apr 17 15:33:14.727074 ip-10-0-133-75 kubenswrapper[2576]: W0417 15:33:14.727028 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeb798e2f_a6ed_47dc_accf_825d0b91a2ab.slice/crio-7cd30c6ac73ff04187978d0a23678e30160b494b0f262a54012456dec834e823 WatchSource:0}: Error finding container 7cd30c6ac73ff04187978d0a23678e30160b494b0f262a54012456dec834e823: Status 404 returned error can't find the container with id 7cd30c6ac73ff04187978d0a23678e30160b494b0f262a54012456dec834e823
Apr 17 15:33:14.728590 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.728570 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 15:33:14.921001 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.920918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6974f5cc54-56jgv_cde72007-672b-4a83-9ebe-149f70e6e122/console/0.log"
Apr 17 15:33:14.948185 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:14.948153 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-xt8cx_90902de8-0523-47d2-bac0-8249a7985ae8/download-server/0.log"
Apr 17 15:33:15.118818 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:15.118779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k" event={"ID":"eb798e2f-a6ed-47dc-accf-825d0b91a2ab","Type":"ContainerStarted","Data":"9fbf54bc61978d76f7b0a9adc7697eeecf124d4ffb564cf4ad676f9d6a25cfdc"}
Apr 17 15:33:15.118818 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:15.118820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k" event={"ID":"eb798e2f-a6ed-47dc-accf-825d0b91a2ab","Type":"ContainerStarted","Data":"7cd30c6ac73ff04187978d0a23678e30160b494b0f262a54012456dec834e823"}
Apr 17 15:33:15.119029 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:15.118857 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:15.133403 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:15.133354 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k" podStartSLOduration=1.133338306 podStartE2EDuration="1.133338306s" podCreationTimestamp="2026-04-17 15:33:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 15:33:15.132312926 +0000 UTC m=+951.588495024" watchObservedRunningTime="2026-04-17 15:33:15.133338306 +0000 UTC m=+951.589520401"
Apr 17 15:33:16.266461 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:16.266435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dw8kz_37750dd6-91d9-43c3-a3ac-75a2c4ce6eec/dns/0.log"
Apr 17 15:33:16.284953 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:16.284931 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dw8kz_37750dd6-91d9-43c3-a3ac-75a2c4ce6eec/kube-rbac-proxy/0.log"
Apr 17 15:33:16.335067 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:16.335027 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bpw5n_e70e8fd7-f8f2-4303-8371-1696921c6746/dns-node-resolver/0.log"
Apr 17 15:33:16.852034 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:16.852005 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wqrkq_a00974bb-abc9-4285-909c-842f9c69b1f3/node-ca/0.log"
Apr 17 15:33:17.699484 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:17.699456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f8m7jp_c2f2fc86-6d29-4540-840d-05d2dba28fae/istio-proxy/0.log"
Apr 17 15:33:17.831405 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:17.831374 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-gl45z_e348ae6f-490b-44bf-9b17-4dbd86c14e76/istio-proxy/0.log"
Apr 17 15:33:18.311161 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:18.311131 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4g86w_b80c4e77-d795-4111-a247-f612ad85f926/serve-healthcheck-canary/0.log"
Apr 17 15:33:18.892668 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:18.892638 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xs229_2fbe12c0-d02c-498e-96a3-2d9911087940/kube-rbac-proxy/0.log"
Apr 17 15:33:18.912137 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:18.912110 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xs229_2fbe12c0-d02c-498e-96a3-2d9911087940/exporter/0.log"
Apr 17 15:33:18.930612 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:18.930587 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xs229_2fbe12c0-d02c-498e-96a3-2d9911087940/extractor/0.log"
Apr 17 15:33:20.882014 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:20.881952 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9bd7bdf77-zwx8q_1e209b7e-8e46-481b-8393-68e4d8fdb20e/manager/0.log"
Apr 17 15:33:21.133144 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:21.133038 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-79652/perf-node-gather-daemonset-kvc6k"
Apr 17 15:33:21.961284 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:21.961254 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-59bc47b496-68z8w_57c6941a-9760-4c68-ad71-47b739189881/manager/0.log"
Apr 17 15:33:27.686813 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:27.686785 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ffhn2_fc69e676-8342-4380-a1ba-56fbb970d9d9/kube-multus-additional-cni-plugins/0.log"
Apr 17 15:33:27.707229 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:27.707201 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ffhn2_fc69e676-8342-4380-a1ba-56fbb970d9d9/egress-router-binary-copy/0.log"
Apr 17 15:33:27.727003 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:27.726976 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ffhn2_fc69e676-8342-4380-a1ba-56fbb970d9d9/cni-plugins/0.log"
Apr 17 15:33:27.747170 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:27.747147 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ffhn2_fc69e676-8342-4380-a1ba-56fbb970d9d9/bond-cni-plugin/0.log"
Apr 17 15:33:27.765443 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:27.765417 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ffhn2_fc69e676-8342-4380-a1ba-56fbb970d9d9/routeoverride-cni/0.log"
Apr 17 15:33:27.783136 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:27.783113 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ffhn2_fc69e676-8342-4380-a1ba-56fbb970d9d9/whereabouts-cni-bincopy/0.log"
Apr 17 15:33:27.801364 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:27.801342 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ffhn2_fc69e676-8342-4380-a1ba-56fbb970d9d9/whereabouts-cni/0.log"
Apr 17 15:33:28.004668 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:28.004637 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vndw6_a89cc04d-c377-4ac2-9120-63ebc1ca2990/kube-multus/0.log"
Apr 17 15:33:28.050866 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:28.050825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-82cq8_41bb8b03-e874-455f-8416-b76d91f0f117/network-metrics-daemon/0.log"
Apr 17 15:33:28.069918 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:28.069881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-82cq8_41bb8b03-e874-455f-8416-b76d91f0f117/kube-rbac-proxy/0.log"
Apr 17 15:33:29.236656 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:29.236610 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-controller/0.log"
Apr 17 15:33:29.254161 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:29.254136 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/0.log"
Apr 17 15:33:29.258689 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:29.258668 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovn-acl-logging/1.log"
Apr 17 15:33:29.275116 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:29.275086 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/kube-rbac-proxy-node/0.log"
Apr 17 15:33:29.306464 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:29.306425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 15:33:29.339000 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:29.338966 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/northd/0.log"
Apr 17 15:33:29.357591 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:29.357567 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/nbdb/0.log"
Apr 17 15:33:29.377632 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:29.377608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/sbdb/0.log"
Apr 17 15:33:29.472626 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:29.472589 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7p8ns_1e3cfe5e-0c86-4d14-ac14-7390274f338b/ovnkube-controller/0.log"
Apr 17 15:33:30.845379 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:30.845349 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-n2x89_7400fb35-d1c0-4009-bdc4-483256d99f9d/network-check-target-container/0.log"
Apr 17 15:33:31.892093 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:31.892041 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2vktv_6e474365-0ff7-4228-b6b7-3f49bc17a45b/iptables-alerter/0.log"
Apr 17 15:33:32.595912 ip-10-0-133-75 kubenswrapper[2576]: I0417 15:33:32.595883 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-7sgdp_03062b60-45de-4e91-92e7-3959d5322bd1/tuned/0.log"