Apr 22 21:06:31.367293 ip-10-0-130-19 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 21:06:31.367306 ip-10-0-130-19 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 21:06:31.367316 ip-10-0-130-19 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 21:06:31.367610 ip-10-0-130-19 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 21:06:41.433917 ip-10-0-130-19 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 21:06:41.433933 ip-10-0-130-19 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 4a01bd80ddb84999aab32cbe8c0e275b --
Apr 22 21:08:41.758580 ip-10-0-130-19 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 21:08:42.261629 ip-10-0-130-19 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:08:42.261629 ip-10-0-130-19 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 21:08:42.261629 ip-10-0-130-19 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:08:42.261629 ip-10-0-130-19 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 21:08:42.261629 ip-10-0-130-19 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:08:42.265049 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.264964 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 21:08:42.272319 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272300 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:08:42.272319 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272319 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272324 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272328 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272331 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272335 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272338 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272341 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272344 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272346 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272349 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272352 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272355 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272358 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272360 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272363 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272366 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272369 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272371 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272374 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272376 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:08:42.272410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272379 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272381 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272384 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272386 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272389 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272392 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272395 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272404 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272407 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272410 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272413 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272417 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272420 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272422 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272425 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272428 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272431 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272433 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272436 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272438 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:08:42.272880 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272441 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272443 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272446 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272448 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272451 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272453 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272455 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272458 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272460 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272463 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272466 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272468 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272471 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272473 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272479 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272483 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272486 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272489 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272492 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:08:42.273461 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272495 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272497 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272500 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272503 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272506 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272508 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272511 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272514 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272517 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272519 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272522 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272524 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272528 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272530 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272533 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272536 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272538 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272545 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272550 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272553 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:08:42.273923 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272556 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272558 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272561 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272563 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272566 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272569 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272955 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272960 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272963 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272966 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272969 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272972 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272975 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272977 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272980 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272983 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272985 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272988 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272990 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:08:42.274427 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272993 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.272997 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273001 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273004 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273007 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273010 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273012 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273015 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273017 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273020 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273022 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273025 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273028 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273031 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273034 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273036 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273039 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273041 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273044 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273047 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:08:42.274888 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273050 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273053 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273056 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273059 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273061 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273064 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273066 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273069 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273072 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273074 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273077 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273079 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273082 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273084 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273087 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273089 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273092 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273094 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273097 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273099 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:08:42.275405 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273102 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273104 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273107 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273109 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273113 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273116 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273118 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273121 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273123 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273126 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273129 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273131 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273134 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273137 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273139 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273142 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273144 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273147 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273150 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:08:42.275939 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273152 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273156 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273160 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273164 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273166 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273169 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273172 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273174 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273177 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273179 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273181 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273184 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273186 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273189 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273273 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273280 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273287 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273295 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273301 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273304 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273308 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 21:08:42.276432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273314 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273317 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273320 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273324 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273327 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273330 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273333 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273336 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273339 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273342 2570 flags.go:64] FLAG: --cloud-config=""
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273345 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273348 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273352 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273355 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273359 2570 flags.go:64] FLAG: --config-dir=""
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273362 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273365 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273369 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273372 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273375 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273378 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273382 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273385 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273388 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273391 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 21:08:42.276943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273393 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273398 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273401 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273406 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273409 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273412 2570 flags.go:64] FLAG: --enable-server="true"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273415 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273419 2570 flags.go:64] FLAG: --event-burst="100"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273422 2570 flags.go:64] FLAG: --event-qps="50"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273425 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273429 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273432 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273435 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273438 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273441 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273444 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273447 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273450 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273454 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273457 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273460 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273463 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273466 2570 flags.go:64] FLAG: --feature-gates=""
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273470 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273473 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 21:08:42.277557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273477 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273480 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273483 2570 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273486 2570 flags.go:64] FLAG: --help="false"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273489 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-130-19.ec2.internal"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273492 2570 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273495 2570 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273498 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273502 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273505 2570 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273509 2570 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273512 2570 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273515 2570 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273518 2570 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273521 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273524 2570 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273527 2570 flags.go:64] FLAG: --kube-reserved=""
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273530 2570 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273533 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273536 2570 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273539 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273542 2570 flags.go:64] FLAG: --lock-file=""
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273544 2570 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273547 2570 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 21:08:42.278177 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273550 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273557 2570 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273560 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273563 2570 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273566 2570 flags.go:64] FLAG: --logging-format="text"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273569 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273572 2570 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273575 2570 flags.go:64] FLAG: --manifest-url=""
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273578 2570 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273582 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273585 2570 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273589 2570 flags.go:64] FLAG: --max-pods="110"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273592 2570 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273595 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273598 2570 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273601 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273604 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273607 2570 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273611 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273619 2570 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273622 2570 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273625 2570 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273629 2570 flags.go:64] FLAG: --pod-cidr=""
Apr 22 21:08:42.278789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273632 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273637 2570 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273640 2570 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273643 2570 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273646 2570 flags.go:64] FLAG: --port="10250"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273649 2570 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273652 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0863364cb64da6eb1"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273655 2570 flags.go:64] FLAG: --qos-reserved=""
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273658 2570 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273661 2570 flags.go:64] FLAG: --register-node="true"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273665 2570 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273668 2570 flags.go:64] FLAG: --register-with-taints=""
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273672 2570 flags.go:64] FLAG: --registry-burst="10"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273675 2570 flags.go:64] FLAG: --registry-qps="5"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273677 2570 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273680 2570 flags.go:64] FLAG: --reserved-memory=""
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273684 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273687 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273690 2570 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273693 2570 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273696 2570 flags.go:64] FLAG: --runonce="false"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273699 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273702 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273705 2570 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273708 2570 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273711 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 21:08:42.279355 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273714 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273717 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273721 2570 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273724 2570 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273727 2570 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273730 2570 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273733 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273736 2570 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273739 2570 flags.go:64] FLAG: --system-cgroups=""
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273742 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273747 2570 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273750 2570 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273753 2570 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273757 2570 flags.go:64] FLAG: --tls-min-version=""
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273760 2570 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273763 2570 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273767 2570 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273770 2570 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273773 2570 flags.go:64] FLAG: --v="2"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273778 2570 flags.go:64] FLAG: --version="false"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273782 2570 flags.go:64] FLAG: --vmodule=""
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273786 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.273789 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273882 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:08:42.279988 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273885 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273888 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273891 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273893 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273897 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273899 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273902 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273905 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273907 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273910 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273914 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273916 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273920 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273924 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273927 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273931 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273934 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273937 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273941 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:08:42.280587 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273945 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273948 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273951 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273954 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273957 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273962 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273965 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273968 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273971 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273973 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273976 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273979 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273982 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273984 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273987 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273990 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273992 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273995 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.273997 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274000 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:08:42.281099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274002 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274005 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274007 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274011 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274013 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274016 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274019 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274021 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274024 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274027 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274029 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274032 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274035 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274037 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274040 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274042 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274045 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274049 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274051 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274054 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:08:42.281605 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274056 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274059 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274061 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274064 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274066 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274069 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274071 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274074 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274076 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274079 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274081 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274084 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274086 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274089 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274091 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274095 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274098 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274100 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274103 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274105 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:08:42.282099 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274108 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274110 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274113 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274116 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274118 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.274121 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.275049 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.281859 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.281877 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281926 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281932 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281936 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281939 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281942 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281945 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281948 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:08:42.282619 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281950 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281953 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:08:42.283018
ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281956 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281958 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281961 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281964 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281967 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281969 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281972 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281974 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281977 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281979 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281982 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281985 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281987 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281990 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281992 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281996 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.281999 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 21:08:42.283018 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282001 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282004 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282006 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282009 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282012 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 
21:08:42.282016 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282019 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282021 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282024 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282026 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282029 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282032 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282034 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282037 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282039 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282042 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282045 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282047 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282049 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282052 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 21:08:42.283505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282055 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282057 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282060 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282063 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282065 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282068 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282070 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282074 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282079 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282082 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282085 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282088 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282091 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282094 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282096 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282099 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282102 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282105 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282108 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 21:08:42.284003 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282111 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282114 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282117 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282119 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282122 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282124 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282127 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282129 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282132 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282135 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282137 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 
21:08:42.282141 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282145 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282148 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282150 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282153 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282156 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282158 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282161 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282163 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 21:08:42.284474 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282166 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.282171 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282291 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282296 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282299 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282302 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282305 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282308 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282311 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282314 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282317 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration 
Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282320 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282323 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282326 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282329 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 21:08:42.284985 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282332 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282334 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282337 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282340 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282342 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282345 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282347 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282350 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282352 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282354 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282357 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282360 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282362 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282365 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282367 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282369 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282372 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282374 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 21:08:42.285369 ip-10-0-130-19 
kubenswrapper[2570]: W0422 21:08:42.282377 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282379 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 21:08:42.285369 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282383 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282385 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282388 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282390 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282393 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282399 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282402 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282404 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282407 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282410 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282413 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282415 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282418 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282420 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282423 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282425 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282429 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282432 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282435 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 21:08:42.285855 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282438 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282441 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282444 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282447 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282449 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282452 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282454 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282457 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282460 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282462 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282465 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282467 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282470 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282472 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282475 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282477 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282480 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282482 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282485 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282487 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 21:08:42.286341 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282490 2570 feature_gate.go:328] unrecognized 
feature gate: MultiDiskSetup
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282493 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282496 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282499 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282503 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282505 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282508 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282511 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282513 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282515 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282518 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282520 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282523 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:42.282525 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.282530 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 21:08:42.286837 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.283385 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 21:08:42.287200 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.285805 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 21:08:42.287200 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.286850 2570 server.go:1019] "Starting client certificate rotation"
Apr 22 21:08:42.287200 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.286941 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 21:08:42.287200 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.286983 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
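The feature_gate.go lines above show the same pattern several times in a row: the kubelet parses the full OpenShift gate map once per configuration pass, warns for every name its vendored Kubernetes does not recognize, and then records only the recognized gates in the feature_gate.go:384 "feature gates: {map[...]}" summary. A minimal, self-contained sketch of that parse-and-warn behavior (illustrative only; the map and function names here are hypothetical, not kubelet source):

```go
package main

import "fmt"

// known holds the gates this binary was compiled with, plus their defaults.
// In the real kubelet this lives in the vendored feature-gate package.
var known = map[string]bool{
	"KMSv1":    false,
	"NodeSwap": false,
}

// setFromMap mirrors the observed behavior: unknown names only produce a
// warning and are skipped; recognized names override the default.
func setFromMap(requested map[string]bool) {
	for name, enabled := range requested {
		if _, ok := known[name]; !ok {
			fmt.Printf("W] unrecognized feature gate: %s\n", name)
			continue
		}
		known[name] = enabled
	}
}

func main() {
	setFromMap(map[string]bool{"KMSv1": true, "ManagedBootImages": true})
	fmt.Println("feature gates:", known) // ManagedBootImages was only warned about
}
```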
logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 21:08:42.313734 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.313715 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 21:08:42.319820 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.319743 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 21:08:42.334753 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.334733 2570 log.go:25] "Validated CRI v1 runtime API" Apr 22 21:08:42.341869 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.341851 2570 log.go:25] "Validated CRI v1 image API" Apr 22 21:08:42.343275 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.343259 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 21:08:42.344799 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.344779 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 21:08:42.347838 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.347817 2570 fs.go:135] Filesystem UUIDs: map[08d2c041-ef1e-4d57-97b9-7972e7b3d663:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 93233533-776d-4e06-8ed9-4990a0520e89:/dev/nvme0n1p3] Apr 22 21:08:42.347908 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.347837 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 21:08:42.353931 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.353699 2570 manager.go:217] Machine: {Timestamp:2026-04-22 21:08:42.351496767 +0000 UTC m=+0.460307671 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101361 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29c2615069a51f127fe574a9d9696a SystemUUID:ec29c261-5069-a51f-127f-e574a9d9696a BootID:4a01bd80-ddb8-4999-aab3-2cbe8c0e275b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8d:42:38:47:93 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8d:42:38:47:93 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:c1:b1:c2:ca:5c Speed:0 Mtu:1500}] Topology:[{Id:0 
Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 21:08:42.353931 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.353926 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 21:08:42.354034 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.354005 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 21:08:42.355186 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.355158 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 21:08:42.355334 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.355186 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-19.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 21:08:42.355384 ip-10-0-130-19 kubenswrapper[2570]: I0422 
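Each HardEvictionThresholds entry in the nodeConfig above carries either an absolute Quantity ("memory.available" < 100Mi) or a Percentage of capacity ("nodefs.available" < 10%). A sketch of how such a threshold could be evaluated (field names here are simplified stand-ins, not the kubelet's eviction API types):

```go
package main

import "fmt"

// threshold mirrors the shape of the HardEvictionThresholds entries above:
// either quantity (bytes) or percentage (fraction of capacity) is set.
type threshold struct {
	signal     string
	quantity   int64   // absolute bytes; 0 if unused
	percentage float64 // fraction of capacity; 0 if unused
}

// exceeded reports whether available has fallen below the threshold,
// resolving a percentage against capacity the way "nodefs.available < 10%"
// is resolved against the filesystem size.
func exceeded(t threshold, available, capacity int64) bool {
	limit := t.quantity
	if t.percentage > 0 {
		limit = int64(t.percentage * float64(capacity))
	}
	return available < limit
}

func main() {
	memHard := threshold{signal: "memory.available", quantity: 100 << 20} // 100Mi
	fmt.Println(exceeded(memHard, 64<<20, 32<<30))                        // true: 64Mi < 100Mi
}
```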
Apr 22 21:08:42.355384 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.355343 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 21:08:42.355384 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.355351 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 21:08:42.355384 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.355364 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 21:08:42.356321 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.356309 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 21:08:42.357577 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.357567 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 21:08:42.357680 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.357672 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 21:08:42.360459 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.360449 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 21:08:42.360498 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.360468 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 21:08:42.360498 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.360482 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 21:08:42.360498 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.360492 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 22 21:08:42.360498 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.360501 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 21:08:42.361445 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.361419 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-szkd7"
Apr 22 21:08:42.361802 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.361791 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 21:08:42.361836 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.361810 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 21:08:42.365735 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.365718 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 21:08:42.369423 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.369401 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-szkd7"
Apr 22 21:08:42.369545 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.369530 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 21:08:42.371996 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.371981 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372001 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372011 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372017 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372023 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372031 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372037 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372043 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372053 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372059 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 21:08:42.372093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372093 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 21:08:42.372495 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.372117 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 21:08:42.373145 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.373132 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 21:08:42.373180 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.373148 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 21:08:42.373513 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.373488 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-19.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 21:08:42.373587 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.373492 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 21:08:42.377403 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.377387 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 21:08:42.377479 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.377434 2570 server.go:1295] "Started kubelet"
Apr 22 21:08:42.377566 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.377523 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 21:08:42.377617 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.377564 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 21:08:42.377651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.377622 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 21:08:42.378235 ip-10-0-130-19 systemd[1]: Started Kubernetes Kubelet.
Apr 22 21:08:42.379615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.379599 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 21:08:42.380611 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.380596 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 21:08:42.387137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.387121 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 21:08:42.387698 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.387682 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 21:08:42.388281 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.388239 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 21:08:42.388381 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388309 2570 factory.go:55] Registering systemd factory
Apr 22 21:08:42.388381 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388364 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 22 21:08:42.388565 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388537 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 21:08:42.388565 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388544 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 21:08:42.388565 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388563 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 21:08:42.388758 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388620 2570 factory.go:153] Registering CRI-O factory
Apr 22 21:08:42.388758 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388635 2570 factory.go:223] Registration of the crio container factory successfully
Apr 22 21:08:42.388758 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388674 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 21:08:42.388758 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388682 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 21:08:42.388758 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388705 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 21:08:42.388758 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388726 2570 factory.go:103] Registering Raw factory
Apr 22 21:08:42.388758 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.388737 2570 manager.go:1196] Started watching for new ooms in manager
Apr 22 21:08:42.389187 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.389152 2570 manager.go:319] Starting recovery of all containers
Apr 22 21:08:42.389426 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.389403 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found"
Apr 22 21:08:42.390428 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.390406 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
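The containerd factory registration a few entries above fails because cAdvisor probes /run/containerd/containerd.sock on a node whose runtime is CRI-O, so that socket never exists. The failure is reproducible with a plain unix-socket dial (sketch; the path comes from the log, the code is not cAdvisor's):

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	// On a CRI-O host this socket is absent, so the dial fails with the
	// same "connect: no such file or directory" the factory logs above.
	conn, err := net.Dial("unix", "/run/containerd/containerd.sock")
	if err != nil {
		fmt.Println(err)
		return
	}
	conn.Close()
}
```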
"ip-10-0-130-19.ec2.internal" not found Apr 22 21:08:42.393216 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.393194 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-19.ec2.internal\" not found" node="ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.397549 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.397533 2570 manager.go:324] Recovery completed Apr 22 21:08:42.402797 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.402783 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:42.405286 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.405270 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:42.405354 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.405297 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:42.405354 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.405307 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:42.405794 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.405780 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 21:08:42.405794 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.405791 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 21:08:42.405891 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.405807 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 22 21:08:42.407106 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.407094 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-19.ec2.internal" not found Apr 22 21:08:42.409776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.409759 2570 policy_none.go:49] "None policy: Start" Apr 22 21:08:42.409835 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.409784 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 21:08:42.409835 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.409799 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 22 21:08:42.444850 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.444834 2570 manager.go:341] "Starting Device Plugin manager" Apr 22 21:08:42.466113 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.444871 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 21:08:42.466113 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.444884 2570 server.go:85] "Starting device plugin registration server" Apr 22 21:08:42.466113 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.445135 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 21:08:42.466113 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.445152 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 21:08:42.466113 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.445235 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 21:08:42.466113 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.445318 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 21:08:42.466113 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.445325 2570 
plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 21:08:42.466113 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.445905 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 21:08:42.466113 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.445945 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:42.466530 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.466515 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-19.ec2.internal" not found Apr 22 21:08:42.483242 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.483219 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 21:08:42.484360 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.484343 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 21:08:42.484422 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.484375 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 21:08:42.484422 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.484395 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 21:08:42.484422 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.484407 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 21:08:42.484552 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.484439 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 21:08:42.486929 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.486910 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:42.545988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.545917 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:42.546808 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.546794 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:42.546870 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.546824 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:42.546870 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.546835 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:42.546870 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.546857 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.556302 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.556284 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.556378 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.556304 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-19.ec2.internal\": node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:42.573056 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.573035 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
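kubelet_node_status.go:597 above says "will retry": the status update is attempted a few more times within the sync before the error is surfaced, which is why the node settles despite the transient "not found" races during registration. The pattern, reduced to a sketch (attempt count and names are hypothetical, not kubelet source):

```go
package main

import (
	"errors"
	"fmt"
)

// tryUpdateNodeStatus retries a status update a fixed number of times,
// returning the last error if every attempt fails.
func tryUpdateNodeStatus(update func() error, attempts int) error {
	var err error
	for i := 0; i < attempts; i++ {
		if err = update(); err == nil {
			return nil
		}
	}
	return fmt.Errorf("failed after %d attempts: %w", attempts, err)
}

func main() {
	err := tryUpdateNodeStatus(func() error {
		return errors.New(`node "ip-10-0-130-19.ec2.internal" not found`)
	}, 5)
	fmt.Println(err)
}
```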
\"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:42.585269 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.585236 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal"] Apr 22 21:08:42.585331 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.585324 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:42.586126 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.586112 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:42.586195 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.586140 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:42.586195 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.586154 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:42.588464 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.588451 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:42.588617 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.588603 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.588681 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.588632 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:42.588790 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.588775 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.588835 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.588801 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.588835 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.588820 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a5e99fc6db543cf6951686e44ee274cc-config\") pod \"kube-apiserver-proxy-ip-10-0-130-19.ec2.internal\" (UID: \"a5e99fc6db543cf6951686e44ee274cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.589161 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.589141 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:42.589161 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.589149 2570 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:42.589345 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.589188 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:42.589345 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.589203 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:42.589345 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.589166 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:42.589345 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.589236 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:42.591386 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.591370 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.591467 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.591400 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:08:42.592037 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.592019 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:08:42.592126 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.592041 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:08:42.592126 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.592053 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:08:42.619888 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.619867 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-19.ec2.internal\" not found" node="ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.624149 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.624132 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-19.ec2.internal\" not found" node="ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.673962 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.673942 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:42.689329 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.689308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.689389 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.689339 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.689389 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.689357 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a5e99fc6db543cf6951686e44ee274cc-config\") pod \"kube-apiserver-proxy-ip-10-0-130-19.ec2.internal\" (UID: \"a5e99fc6db543cf6951686e44ee274cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.689454 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.689398 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.689454 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.689411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a5e99fc6db543cf6951686e44ee274cc-config\") pod \"kube-apiserver-proxy-ip-10-0-130-19.ec2.internal\" (UID: \"a5e99fc6db543cf6951686e44ee274cc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.689454 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.689415 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/62d93d1676550c945f317175b90f5b9f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal\" (UID: \"62d93d1676550c945f317175b90f5b9f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.774912 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.774872 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:42.875700 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.875635 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:42.922099 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.922076 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.926637 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:42.926615 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" Apr 22 21:08:42.976199 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:42.976159 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:43.076719 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:43.076684 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:43.177236 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:43.177152 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:43.277384 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:43.277359 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:43.278022 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.277407 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:43.287507 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.287488 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 21:08:43.287611 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.287596 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 21:08:43.287674 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.287633 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 21:08:43.287674 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.287648 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 21:08:43.372398 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.372360 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 21:03:42 +0000 UTC" deadline="2027-09-24 20:21:01.748792011 +0000 UTC" Apr 22 21:08:43.372398 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.372393 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12479h12m18.376401859s" Apr 22 21:08:43.377457 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:43.377438 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:43.387596 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.387580 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 21:08:43.408787 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.408758 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 21:08:43.427226 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.427190 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-29fcv" Apr 22 21:08:43.434581 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.434531 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-29fcv" Apr 22 21:08:43.477956 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:43.477920 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:43.521849 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:43.521807 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d93d1676550c945f317175b90f5b9f.slice/crio-094bd9e5f9e8919e9cbc33afe3a670e040c172b6f5fe088b57c4d0b35e624d11 WatchSource:0}: Error finding container 094bd9e5f9e8919e9cbc33afe3a670e040c172b6f5fe088b57c4d0b35e624d11: Status 404 returned error can't find the container with id 094bd9e5f9e8919e9cbc33afe3a670e040c172b6f5fe088b57c4d0b35e624d11 Apr 22 21:08:43.522208 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:43.522187 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5e99fc6db543cf6951686e44ee274cc.slice/crio-3df941341ac892d1f2b2510db8a41b572b868e14d9b25320540833a9bbad8c47 WatchSource:0}: Error finding container 3df941341ac892d1f2b2510db8a41b572b868e14d9b25320540833a9bbad8c47: Status 404 returned error can't find the container with id 3df941341ac892d1f2b2510db8a41b572b868e14d9b25320540833a9bbad8c47 Apr 22 21:08:43.527049 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.527035 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:08:43.578031 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:43.578006 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-19.ec2.internal\" not found" Apr 22 21:08:43.608748 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.608725 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:43.688682 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.688603 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" Apr 22 21:08:43.702907 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.702880 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 21:08:43.703963 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.703946 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" Apr 22 21:08:43.711325 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:43.711309 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 21:08:44.321870 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.321630 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:44.361855 
ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.361823 2570 apiserver.go:52] "Watching apiserver" Apr 22 21:08:44.369651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.369628 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 21:08:44.369990 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.369958 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-42mgf","openshift-image-registry/node-ca-4wjxv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal","openshift-multus/multus-additional-cni-plugins-bb9x8","openshift-multus/multus-l48h6","openshift-network-diagnostics/network-check-target-7mmfr","kube-system/konnectivity-agent-fxk2q","kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c","openshift-cluster-node-tuning-operator/tuned-hdp58","openshift-multus/network-metrics-daemon-rpz8w","openshift-network-operator/iptables-alerter-rnctv"] Apr 22 21:08:44.372672 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.372647 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.374712 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.374687 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.375122 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.374975 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 21:08:44.375122 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.375103 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 21:08:44.375289 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.375156 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 21:08:44.375289 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.375164 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 21:08:44.375430 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.375309 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pm9kf\"" Apr 22 21:08:44.375430 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.375392 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 21:08:44.375430 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.375425 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 21:08:44.377340 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.376995 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 21:08:44.377340 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.377017 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.377340 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.377036 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 21:08:44.377340 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.377242 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 21:08:44.377597 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.377433 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fndn8\"" Apr 22 21:08:44.379167 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.379147 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 21:08:44.379279 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.379232 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 21:08:44.379349 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.379326 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.379470 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.379453 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 21:08:44.379702 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.379686 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m42kf\"" Apr 22 21:08:44.379772 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.379688 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 21:08:44.379828 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.379759 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 21:08:44.381467 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.381448 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 21:08:44.381655 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.381639 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tkx2v\"" Apr 22 21:08:44.384011 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.383959 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:44.384113 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:44.384054 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:08:44.384113 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.384068 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:08:44.386193 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.386160 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 21:08:44.386696 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.386620 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-kzp5w\"" Apr 22 21:08:44.386789 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.386750 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 21:08:44.386863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.386845 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.389655 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.389555 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 21:08:44.389655 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.389591 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 21:08:44.389791 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.389659 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 21:08:44.389791 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.389742 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-q7rq8\"" Apr 22 21:08:44.390203 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.390173 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.392367 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.392348 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pzc7r\"" Apr 22 21:08:44.392631 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.392616 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:44.392715 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.392630 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:08:44.392715 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.392641 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 21:08:44.392786 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:44.392712 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:08:44.394908 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.394893 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.397013 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.396957 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:08:44.397013 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.396969 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dj76g\"" Apr 22 21:08:44.397013 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.396997 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 21:08:44.397198 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.397059 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 21:08:44.398497 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398476 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-cnibin\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.398597 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398511 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-cni-binary-copy\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.398597 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398547 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.398597 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398572 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-var-lib-kubelet\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.398812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398598 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99gg7\" (UniqueName: \"kubernetes.io/projected/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-kube-api-access-99gg7\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.398812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398621 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-slash\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.398812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398645 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-os-release\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.398812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-socket-dir-parent\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.398812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398691 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-conf-dir\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.398812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-etc-selinux\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.398812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-sysconfig\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.398812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398768 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-systemd-units\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.398812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398798 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-run-netns\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398837 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-system-cni-dir\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398860 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-os-release\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398885 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr8nm\" (UniqueName: \"kubernetes.io/projected/32a5e549-f5ac-4611-99cf-e4b2fcd750db-kube-api-access-rr8nm\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398907 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-modprobe-d\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398957 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-sys\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.398991 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-etc-kubernetes\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399023 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e-konnectivity-ca\") pod \"konnectivity-agent-fxk2q\" (UID: \"79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e\") " pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399048 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq74q\" (UniqueName: \"kubernetes.io/projected/e5233de2-3a1e-46e4-aa55-d01d4beebd14-kube-api-access-dq74q\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399086 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-run\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399124 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/400a6d5d-3d9c-4307-9701-895aad7b37b7-serviceca\") pod \"node-ca-4wjxv\" (UID: \"400a6d5d-3d9c-4307-9701-895aad7b37b7\") 
" pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-device-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-kubernetes\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399210 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-sysctl-conf\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.399236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399241 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6521a5ee-a452-4638-a718-497b64cfb146-etc-tuned\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399292 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-run-ovn-kubernetes\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399330 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399358 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-cni-dir\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399385 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399407 
2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-host\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399435 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-run-openvswitch\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399457 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-run-ovn\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399474 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-registration-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-var-lib-cni-multus\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399516 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-hostroot\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399533 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-cnibin\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399562 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-daemon-config\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399596 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-socket-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: 
\"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399631 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e-agent-certs\") pod \"konnectivity-agent-fxk2q\" (UID: \"79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e\") " pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399655 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-ovnkube-script-lib\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.399958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mskg\" (UniqueName: \"kubernetes.io/projected/400a6d5d-3d9c-4307-9701-895aad7b37b7-kube-api-access-9mskg\") pod \"node-ca-4wjxv\" (UID: \"400a6d5d-3d9c-4307-9701-895aad7b37b7\") " pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399700 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-var-lib-kubelet\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399718 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6521a5ee-a452-4638-a718-497b64cfb146-tmp\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399740 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-sys-fs\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399753 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-sysctl-d\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399773 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-lib-modules\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399827 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-run-k8s-cni-cncf-io\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399857 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-run-multus-certs\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399879 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjvdb\" (UniqueName: \"kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb\") pod \"network-check-target-7mmfr\" (UID: \"caf99631-e974-40a5-90ee-50812f2ae5a4\") " pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399897 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-ovnkube-config\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399918 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-env-overrides\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399931 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-ovn-node-metrics-cert\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399943 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/400a6d5d-3d9c-4307-9701-895aad7b37b7-host\") pod \"node-ca-4wjxv\" (UID: \"400a6d5d-3d9c-4307-9701-895aad7b37b7\") " pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.399965 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-system-cni-dir\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400003 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/32a5e549-f5ac-4611-99cf-e4b2fcd750db-cni-binary-copy\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400024 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqdms\" (UniqueName: \"kubernetes.io/projected/6521a5ee-a452-4638-a718-497b64cfb146-kube-api-access-rqdms\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.400776 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400045 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-kubelet\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400067 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-etc-openvswitch\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400118 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-log-socket\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400157 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-cni-netd\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kmzd\" (UniqueName: \"kubernetes.io/projected/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-kube-api-access-9kmzd\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400201 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-run-netns\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-var-lib-cni-bin\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " 
pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400267 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-var-lib-openvswitch\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400285 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-cni-bin\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400312 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400339 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-run-systemd\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400368 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400399 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-systemd\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.401527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.400433 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-node-log\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.435896 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.435872 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:03:43 +0000 UTC" deadline="2027-11-03 22:41:16.383223836 +0000 UTC" Apr 22 21:08:44.435896 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.435894 2570 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13441h32m31.947331652s" Apr 22 21:08:44.488698 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.488630 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" event={"ID":"a5e99fc6db543cf6951686e44ee274cc","Type":"ContainerStarted","Data":"3df941341ac892d1f2b2510db8a41b572b868e14d9b25320540833a9bbad8c47"} Apr 22 21:08:44.489220 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.489194 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 21:08:44.489705 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.489678 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" event={"ID":"62d93d1676550c945f317175b90f5b9f","Type":"ContainerStarted","Data":"094bd9e5f9e8919e9cbc33afe3a670e040c172b6f5fe088b57c4d0b35e624d11"} Apr 22 21:08:44.501060 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501039 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-daemon-config\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.501180 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-socket-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.501180 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501085 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e-agent-certs\") pod \"konnectivity-agent-fxk2q\" (UID: \"79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e\") " pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:08:44.501180 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501108 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-ovnkube-script-lib\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.501385 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501302 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mskg\" (UniqueName: \"kubernetes.io/projected/400a6d5d-3d9c-4307-9701-895aad7b37b7-kube-api-access-9mskg\") pod \"node-ca-4wjxv\" (UID: \"400a6d5d-3d9c-4307-9701-895aad7b37b7\") " pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.501385 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501366 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-var-lib-kubelet\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.501491 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501393 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6521a5ee-a452-4638-a718-497b64cfb146-tmp\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.501491 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501440 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-sys-fs\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.501491 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-sysctl-d\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.501651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-lib-modules\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.501651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501516 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-run-k8s-cni-cncf-io\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.501651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501325 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-socket-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.501651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501541 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-run-multus-certs\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.501651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvdb\" (UniqueName: \"kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb\") pod \"network-check-target-7mmfr\" (UID: \"caf99631-e974-40a5-90ee-50812f2ae5a4\") " pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:44.501651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501526 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 21:08:44.501651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501588 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-ovnkube-config\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.501651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501635 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-var-lib-kubelet\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501702 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-sys-fs\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501719 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-lib-modules\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-daemon-config\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-env-overrides\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501794 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-ovn-node-metrics-cert\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-run-k8s-cni-cncf-io\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501815 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-sysctl-d\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501819 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/400a6d5d-3d9c-4307-9701-895aad7b37b7-host\") pod \"node-ca-4wjxv\" (UID: \"400a6d5d-3d9c-4307-9701-895aad7b37b7\") " pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501830 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-run-multus-certs\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501759 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-ovnkube-script-lib\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501863 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-system-cni-dir\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32a5e549-f5ac-4611-99cf-e4b2fcd750db-cni-binary-copy\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501918 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqdms\" (UniqueName: \"kubernetes.io/projected/6521a5ee-a452-4638-a718-497b64cfb146-kube-api-access-rqdms\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-kubelet\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501967 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-etc-openvswitch\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501992 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-log-socket\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-cni-netd\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502020 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-kubelet\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502043 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kmzd\" (UniqueName: \"kubernetes.io/projected/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-kube-api-access-9kmzd\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502071 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-run-netns\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502083 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-ovnkube-config\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-var-lib-cni-bin\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501991 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/400a6d5d-3d9c-4307-9701-895aad7b37b7-host\") pod \"node-ca-4wjxv\" (UID: \"400a6d5d-3d9c-4307-9701-895aad7b37b7\") " pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.501940 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-system-cni-dir\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502120 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-var-lib-openvswitch\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502153 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-cni-bin\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502164 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-etc-openvswitch\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502179 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502201 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-log-socket\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502207 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-run-systemd\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502237 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-var-lib-openvswitch\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502237 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-env-overrides\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-var-lib-cni-bin\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.502863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502201 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-run-netns\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502290 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-run-systemd\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502300 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502303 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-cni-bin\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502322 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b522e3e-8771-4c38-8ce3-7ee8c0c5689b-host-slash\") pod \"iptables-alerter-rnctv\" (UID: \"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b\") " pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502342 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-cni-netd\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502352 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-systemd\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-node-log\") pod \"ovnkube-node-42mgf\" (UID: 
\"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-systemd\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502400 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-cnibin\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502434 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-cni-binary-copy\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502462 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502435 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-cnibin\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-var-lib-kubelet\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502466 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-node-log\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502525 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99gg7\" (UniqueName: \"kubernetes.io/projected/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-kube-api-access-99gg7\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502552 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-slash\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.503724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-var-lib-kubelet\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502568 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/32a5e549-f5ac-4611-99cf-e4b2fcd750db-cni-binary-copy\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502583 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-os-release\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502620 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-slash\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-os-release\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502649 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-socket-dir-parent\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-conf-dir\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502699 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-etc-selinux\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502725 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-sysconfig\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502752 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-systemd-units\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-run-netns\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502777 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-conf-dir\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502795 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-etc-selinux\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502790 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502797 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-system-cni-dir\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502835 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-socket-dir-parent\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502840 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-os-release\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.504504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502841 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-systemd-units\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502842 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-run-netns\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502866 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr8nm\" (UniqueName: \"kubernetes.io/projected/32a5e549-f5ac-4611-99cf-e4b2fcd750db-kube-api-access-rr8nm\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502869 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-cni-binary-copy\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-system-cni-dir\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-modprobe-d\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502903 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-os-release\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-sysconfig\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502917 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-sys\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502954 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-etc-kubernetes\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502980 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e-konnectivity-ca\") pod \"konnectivity-agent-fxk2q\" (UID: \"79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e\") " pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502956 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-sys\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502987 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-modprobe-d\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.502982 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-etc-kubernetes\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503007 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq74q\" (UniqueName: \"kubernetes.io/projected/e5233de2-3a1e-46e4-aa55-d01d4beebd14-kube-api-access-dq74q\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503031 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-run\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/400a6d5d-3d9c-4307-9701-895aad7b37b7-serviceca\") pod \"node-ca-4wjxv\" (UID: \"400a6d5d-3d9c-4307-9701-895aad7b37b7\") " pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503083 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:44.505215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503090 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-run\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503110 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9sq\" (UniqueName: \"kubernetes.io/projected/6b522e3e-8771-4c38-8ce3-7ee8c0c5689b-kube-api-access-kt9sq\") pod \"iptables-alerter-rnctv\" (UID: \"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b\") " pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503184 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-device-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503224 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-kubernetes\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503271 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-sysctl-conf\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503294 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6521a5ee-a452-4638-a718-497b64cfb146-etc-tuned\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-device-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-run-ovn-kubernetes\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503441 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-sysctl-conf\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.506068 ip-10-0-130-19 
kubenswrapper[2570]: I0422 21:08:44.503316 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-host-run-ovn-kubernetes\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503494 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503537 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfmpp\" (UniqueName: \"kubernetes.io/projected/f4126f8f-7b88-4c50-82f3-3a91a3388519-kube-api-access-qfmpp\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503576 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6b522e3e-8771-4c38-8ce3-7ee8c0c5689b-iptables-alerter-script\") pod \"iptables-alerter-rnctv\" (UID: \"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b\") " pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503499 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-etc-kubernetes\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503601 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-cni-dir\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503629 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503635 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.506068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503669 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-host\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503647 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/400a6d5d-3d9c-4307-9701-895aad7b37b7-serviceca\") pod \"node-ca-4wjxv\" (UID: \"400a6d5d-3d9c-4307-9701-895aad7b37b7\") " pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503679 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-multus-cni-dir\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-run-openvswitch\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503690 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503732 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-run-ovn\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6521a5ee-a452-4638-a718-497b64cfb146-host\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-run-openvswitch\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503771 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-registration-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503812 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-run-ovn\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503842 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-var-lib-cni-multus\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503866 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e5233de2-3a1e-46e4-aa55-d01d4beebd14-registration-dir\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503914 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-hostroot\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503937 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503953 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-host-var-lib-cni-multus\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.503995 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-hostroot\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.504024 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-cnibin\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.504126 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/32a5e549-f5ac-4611-99cf-e4b2fcd750db-cnibin\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.506914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.504330 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e-konnectivity-ca\") pod \"konnectivity-agent-fxk2q\" (UID: \"79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e\") " pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:08:44.507757 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.505389 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-ovn-node-metrics-cert\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.507757 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.505412 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e-agent-certs\") pod \"konnectivity-agent-fxk2q\" (UID: \"79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e\") " pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:08:44.507757 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.505647 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6521a5ee-a452-4638-a718-497b64cfb146-etc-tuned\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.507757 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.505870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6521a5ee-a452-4638-a718-497b64cfb146-tmp\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.514035 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:44.513949 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:08:44.514035 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:44.513975 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:08:44.514035 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:44.513991 2570 projected.go:194] Error preparing data for projected volume kube-api-access-bjvdb for pod openshift-network-diagnostics/network-check-target-7mmfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:44.514212 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:44.514058 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb podName:caf99631-e974-40a5-90ee-50812f2ae5a4 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:45.014035566 +0000 UTC m=+3.122846480 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bjvdb" (UniqueName: "kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb") pod "network-check-target-7mmfr" (UID: "caf99631-e974-40a5-90ee-50812f2ae5a4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:44.516555 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.516491 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kmzd\" (UniqueName: \"kubernetes.io/projected/eb7a4eac-7e6d-40ce-abb1-594e34fb2571-kube-api-access-9kmzd\") pod \"multus-additional-cni-plugins-bb9x8\" (UID: \"eb7a4eac-7e6d-40ce-abb1-594e34fb2571\") " pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.517065 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.517042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99gg7\" (UniqueName: \"kubernetes.io/projected/d3a676b5-93c4-4a35-9feb-bcfdb41df40e-kube-api-access-99gg7\") pod \"ovnkube-node-42mgf\" (UID: \"d3a676b5-93c4-4a35-9feb-bcfdb41df40e\") " pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.517342 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.517323 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mskg\" (UniqueName: \"kubernetes.io/projected/400a6d5d-3d9c-4307-9701-895aad7b37b7-kube-api-access-9mskg\") pod \"node-ca-4wjxv\" (UID: \"400a6d5d-3d9c-4307-9701-895aad7b37b7\") " pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.517421 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.517346 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr8nm\" (UniqueName: \"kubernetes.io/projected/32a5e549-f5ac-4611-99cf-e4b2fcd750db-kube-api-access-rr8nm\") pod \"multus-l48h6\" (UID: \"32a5e549-f5ac-4611-99cf-e4b2fcd750db\") " pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.517421 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.517349 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq74q\" (UniqueName: \"kubernetes.io/projected/e5233de2-3a1e-46e4-aa55-d01d4beebd14-kube-api-access-dq74q\") pod \"aws-ebs-csi-driver-node-wqb7c\" (UID: \"e5233de2-3a1e-46e4-aa55-d01d4beebd14\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.517654 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.517637 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqdms\" (UniqueName: \"kubernetes.io/projected/6521a5ee-a452-4638-a718-497b64cfb146-kube-api-access-rqdms\") pod \"tuned-hdp58\" (UID: \"6521a5ee-a452-4638-a718-497b64cfb146\") " pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.604960 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.604928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:44.605128 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.604968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9sq\" (UniqueName: 
\"kubernetes.io/projected/6b522e3e-8771-4c38-8ce3-7ee8c0c5689b-kube-api-access-kt9sq\") pod \"iptables-alerter-rnctv\" (UID: \"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b\") " pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.605128 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.605001 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfmpp\" (UniqueName: \"kubernetes.io/projected/f4126f8f-7b88-4c50-82f3-3a91a3388519-kube-api-access-qfmpp\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:44.605128 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.605045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6b522e3e-8771-4c38-8ce3-7ee8c0c5689b-iptables-alerter-script\") pod \"iptables-alerter-rnctv\" (UID: \"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b\") " pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.605128 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:44.605069 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:44.605394 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:44.605141 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs podName:f4126f8f-7b88-4c50-82f3-3a91a3388519 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:45.105118309 +0000 UTC m=+3.213929200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs") pod "network-metrics-daemon-rpz8w" (UID: "f4126f8f-7b88-4c50-82f3-3a91a3388519") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:44.605394 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.605165 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b522e3e-8771-4c38-8ce3-7ee8c0c5689b-host-slash\") pod \"iptables-alerter-rnctv\" (UID: \"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b\") " pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.605394 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.605234 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6b522e3e-8771-4c38-8ce3-7ee8c0c5689b-host-slash\") pod \"iptables-alerter-rnctv\" (UID: \"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b\") " pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.605669 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.605649 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6b522e3e-8771-4c38-8ce3-7ee8c0c5689b-iptables-alerter-script\") pod \"iptables-alerter-rnctv\" (UID: \"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b\") " pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.613478 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.613454 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt9sq\" (UniqueName: \"kubernetes.io/projected/6b522e3e-8771-4c38-8ce3-7ee8c0c5689b-kube-api-access-kt9sq\") pod \"iptables-alerter-rnctv\" (UID: 
\"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b\") " pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:44.614422 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.614399 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfmpp\" (UniqueName: \"kubernetes.io/projected/f4126f8f-7b88-4c50-82f3-3a91a3388519-kube-api-access-qfmpp\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:44.685218 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.685117 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:44.685218 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.685133 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:08:44.695161 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.695139 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4wjxv" Apr 22 21:08:44.703194 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.703175 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" Apr 22 21:08:44.707930 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.707912 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-l48h6" Apr 22 21:08:44.714474 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.714457 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:08:44.720989 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.720971 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" Apr 22 21:08:44.727580 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.727561 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hdp58" Apr 22 21:08:44.734096 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:44.734081 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rnctv" Apr 22 21:08:45.108595 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.108513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:45.108595 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.108580 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvdb\" (UniqueName: \"kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb\") pod \"network-check-target-7mmfr\" (UID: \"caf99631-e974-40a5-90ee-50812f2ae5a4\") " pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:45.108813 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:45.108638 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:45.108813 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:45.108681 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:08:45.108813 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:45.108699 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:08:45.108813 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:45.108711 2570 projected.go:194] Error preparing data for projected volume kube-api-access-bjvdb for pod openshift-network-diagnostics/network-check-target-7mmfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:45.108813 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:45.108711 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs podName:f4126f8f-7b88-4c50-82f3-3a91a3388519 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:46.108691414 +0000 UTC m=+4.217502307 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs") pod "network-metrics-daemon-rpz8w" (UID: "f4126f8f-7b88-4c50-82f3-3a91a3388519") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:45.108813 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:45.108770 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb podName:caf99631-e974-40a5-90ee-50812f2ae5a4 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:46.108753655 +0000 UTC m=+4.217564551 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bjvdb" (UniqueName: "kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb") pod "network-check-target-7mmfr" (UID: "caf99631-e974-40a5-90ee-50812f2ae5a4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:45.198206 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:45.198171 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b522e3e_8771_4c38_8ce3_7ee8c0c5689b.slice/crio-be14075254781b8763519368a54a7358e980ee57e99c87a13bb1c78717c4dc2c WatchSource:0}: Error finding container be14075254781b8763519368a54a7358e980ee57e99c87a13bb1c78717c4dc2c: Status 404 returned error can't find the container with id be14075254781b8763519368a54a7358e980ee57e99c87a13bb1c78717c4dc2c Apr 22 21:08:45.200681 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:45.200657 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32a5e549_f5ac_4611_99cf_e4b2fcd750db.slice/crio-67a49e17876af287d1ee115b5cf24e404784217401194286a838164580a130db WatchSource:0}: Error finding container 67a49e17876af287d1ee115b5cf24e404784217401194286a838164580a130db: Status 404 returned error can't find the container with id 67a49e17876af287d1ee115b5cf24e404784217401194286a838164580a130db Apr 22 21:08:45.203444 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:45.203425 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6521a5ee_a452_4638_a718_497b64cfb146.slice/crio-5f63d07489e3e1e7f4e530a0f8786e20bf52216af27e0c24c347cab22e1e78bc WatchSource:0}: Error finding container 5f63d07489e3e1e7f4e530a0f8786e20bf52216af27e0c24c347cab22e1e78bc: Status 404 returned error can't find the container with id 5f63d07489e3e1e7f4e530a0f8786e20bf52216af27e0c24c347cab22e1e78bc Apr 22 21:08:45.205039 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:45.205017 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3a676b5_93c4_4a35_9feb_bcfdb41df40e.slice/crio-a7d3fe0458d99d27a73bad3d4c5a771c61c09cd1f15ab5ab2d4dec83518b3738 WatchSource:0}: Error finding container a7d3fe0458d99d27a73bad3d4c5a771c61c09cd1f15ab5ab2d4dec83518b3738: Status 404 returned error can't find the container with id a7d3fe0458d99d27a73bad3d4c5a771c61c09cd1f15ab5ab2d4dec83518b3738 Apr 22 21:08:45.205911 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:45.205885 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5233de2_3a1e_46e4_aa55_d01d4beebd14.slice/crio-10e28e75701bf7a9da816344a29fd3c2ca52417853d8bbb02b3984e702afabb3 WatchSource:0}: Error finding container 10e28e75701bf7a9da816344a29fd3c2ca52417853d8bbb02b3984e702afabb3: Status 404 returned error can't find the container with id 10e28e75701bf7a9da816344a29fd3c2ca52417853d8bbb02b3984e702afabb3 Apr 22 21:08:45.228779 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:45.228758 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400a6d5d_3d9c_4307_9701_895aad7b37b7.slice/crio-8e19b06ecb83c2f8bc6823bc7022b4f14805857c1f82006bdac44c3ca1d7557c WatchSource:0}: Error finding 
container 8e19b06ecb83c2f8bc6823bc7022b4f14805857c1f82006bdac44c3ca1d7557c: Status 404 returned error can't find the container with id 8e19b06ecb83c2f8bc6823bc7022b4f14805857c1f82006bdac44c3ca1d7557c Apr 22 21:08:45.229410 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:45.229344 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb7a4eac_7e6d_40ce_abb1_594e34fb2571.slice/crio-dedf0557f25c4b170dea2002b315d953c56fa25edead8ea2c0dfe0adfc040124 WatchSource:0}: Error finding container dedf0557f25c4b170dea2002b315d953c56fa25edead8ea2c0dfe0adfc040124: Status 404 returned error can't find the container with id dedf0557f25c4b170dea2002b315d953c56fa25edead8ea2c0dfe0adfc040124 Apr 22 21:08:45.230072 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:08:45.230046 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e5edd2_847e_4f99_9bd9_f7ba3a94cd4e.slice/crio-02884464a9e930784c8d34fb4852477d181e3f1b0a355b581ec76564f291c2dd WatchSource:0}: Error finding container 02884464a9e930784c8d34fb4852477d181e3f1b0a355b581ec76564f291c2dd: Status 404 returned error can't find the container with id 02884464a9e930784c8d34fb4852477d181e3f1b0a355b581ec76564f291c2dd Apr 22 21:08:45.436779 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.436701 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:03:43 +0000 UTC" deadline="2028-01-24 13:14:40.138872804 +0000 UTC" Apr 22 21:08:45.436779 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.436732 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15400h5m54.702143246s" Apr 22 21:08:45.492179 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.492136 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hdp58" event={"ID":"6521a5ee-a452-4638-a718-497b64cfb146","Type":"ContainerStarted","Data":"5f63d07489e3e1e7f4e530a0f8786e20bf52216af27e0c24c347cab22e1e78bc"} Apr 22 21:08:45.493236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.493205 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l48h6" event={"ID":"32a5e549-f5ac-4611-99cf-e4b2fcd750db","Type":"ContainerStarted","Data":"67a49e17876af287d1ee115b5cf24e404784217401194286a838164580a130db"} Apr 22 21:08:45.494179 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.494157 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rnctv" event={"ID":"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b","Type":"ContainerStarted","Data":"be14075254781b8763519368a54a7358e980ee57e99c87a13bb1c78717c4dc2c"} Apr 22 21:08:45.495117 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.495090 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fxk2q" event={"ID":"79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e","Type":"ContainerStarted","Data":"02884464a9e930784c8d34fb4852477d181e3f1b0a355b581ec76564f291c2dd"} Apr 22 21:08:45.496057 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.496035 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4wjxv" event={"ID":"400a6d5d-3d9c-4307-9701-895aad7b37b7","Type":"ContainerStarted","Data":"8e19b06ecb83c2f8bc6823bc7022b4f14805857c1f82006bdac44c3ca1d7557c"} Apr 22 21:08:45.497544 ip-10-0-130-19 
kubenswrapper[2570]: I0422 21:08:45.497525 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" event={"ID":"a5e99fc6db543cf6951686e44ee274cc","Type":"ContainerStarted","Data":"61885c04ee8a148cb20b2adf41cb62b3c712f1e35541d2b9baf539c9dee5d555"} Apr 22 21:08:45.498532 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.498500 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" event={"ID":"eb7a4eac-7e6d-40ce-abb1-594e34fb2571","Type":"ContainerStarted","Data":"dedf0557f25c4b170dea2002b315d953c56fa25edead8ea2c0dfe0adfc040124"} Apr 22 21:08:45.499371 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.499346 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" event={"ID":"e5233de2-3a1e-46e4-aa55-d01d4beebd14","Type":"ContainerStarted","Data":"10e28e75701bf7a9da816344a29fd3c2ca52417853d8bbb02b3984e702afabb3"} Apr 22 21:08:45.500189 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.500170 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerStarted","Data":"a7d3fe0458d99d27a73bad3d4c5a771c61c09cd1f15ab5ab2d4dec83518b3738"} Apr 22 21:08:45.508675 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:45.508638 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-19.ec2.internal" podStartSLOduration=2.508627984 podStartE2EDuration="2.508627984s" podCreationTimestamp="2026-04-22 21:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:08:45.508529173 +0000 UTC m=+3.617340085" watchObservedRunningTime="2026-04-22 21:08:45.508627984 +0000 UTC m=+3.617438897" Apr 22 21:08:46.119763 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:46.118838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvdb\" (UniqueName: \"kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb\") pod \"network-check-target-7mmfr\" (UID: \"caf99631-e974-40a5-90ee-50812f2ae5a4\") " pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:46.119763 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:46.118904 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:46.119763 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:46.119121 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:46.119763 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:46.119225 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs podName:f4126f8f-7b88-4c50-82f3-3a91a3388519 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:48.119204599 +0000 UTC m=+6.228015510 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs") pod "network-metrics-daemon-rpz8w" (UID: "f4126f8f-7b88-4c50-82f3-3a91a3388519") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:46.119763 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:46.119343 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:08:46.119763 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:46.119356 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:08:46.119763 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:46.119369 2570 projected.go:194] Error preparing data for projected volume kube-api-access-bjvdb for pod openshift-network-diagnostics/network-check-target-7mmfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:46.119763 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:46.119408 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb podName:caf99631-e974-40a5-90ee-50812f2ae5a4 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:48.119395668 +0000 UTC m=+6.228206565 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-bjvdb" (UniqueName: "kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb") pod "network-check-target-7mmfr" (UID: "caf99631-e974-40a5-90ee-50812f2ae5a4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:46.411751 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:46.411458 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:08:46.487192 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:46.487157 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:46.487643 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:46.487326 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:08:46.487757 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:46.487738 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:46.487849 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:46.487830 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:08:46.516485 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:46.516420 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" event={"ID":"62d93d1676550c945f317175b90f5b9f","Type":"ContainerStarted","Data":"163961b2ef4a92d46533e11b53ebc129a288e66752bc066a9247b70770498bdf"} Apr 22 21:08:47.522025 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:47.521987 2570 generic.go:358] "Generic (PLEG): container finished" podID="62d93d1676550c945f317175b90f5b9f" containerID="163961b2ef4a92d46533e11b53ebc129a288e66752bc066a9247b70770498bdf" exitCode=0 Apr 22 21:08:47.522500 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:47.522037 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" event={"ID":"62d93d1676550c945f317175b90f5b9f","Type":"ContainerDied","Data":"163961b2ef4a92d46533e11b53ebc129a288e66752bc066a9247b70770498bdf"} Apr 22 21:08:48.134934 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:48.134849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:48.134934 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:48.134915 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvdb\" (UniqueName: \"kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb\") pod \"network-check-target-7mmfr\" (UID: \"caf99631-e974-40a5-90ee-50812f2ae5a4\") " pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:48.135156 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:48.135020 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:48.135156 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:48.135050 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:08:48.135156 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:48.135066 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:08:48.135156 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:48.135080 2570 projected.go:194] Error preparing data for projected volume kube-api-access-bjvdb for pod openshift-network-diagnostics/network-check-target-7mmfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:48.135156 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:48.135092 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs podName:f4126f8f-7b88-4c50-82f3-3a91a3388519 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:52.135072069 +0000 UTC m=+10.243882963 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs") pod "network-metrics-daemon-rpz8w" (UID: "f4126f8f-7b88-4c50-82f3-3a91a3388519") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:48.135156 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:48.135137 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb podName:caf99631-e974-40a5-90ee-50812f2ae5a4 nodeName:}" failed. No retries permitted until 2026-04-22 21:08:52.135121915 +0000 UTC m=+10.243932823 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-bjvdb" (UniqueName: "kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb") pod "network-check-target-7mmfr" (UID: "caf99631-e974-40a5-90ee-50812f2ae5a4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:48.488518 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:48.487681 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:48.488518 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:48.487805 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:08:48.488518 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:48.487915 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:48.488518 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:48.488073 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:08:50.485711 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:50.485669 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:50.486153 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:50.485795 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:08:50.486153 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:50.485993 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:50.486153 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:50.486091 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:08:52.165052 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:52.164986 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvdb\" (UniqueName: \"kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb\") pod \"network-check-target-7mmfr\" (UID: \"caf99631-e974-40a5-90ee-50812f2ae5a4\") " pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:52.165542 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:52.165076 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:52.165542 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:52.165140 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:08:52.165542 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:52.165167 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:08:52.165542 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:52.165182 2570 projected.go:194] Error preparing data for projected volume kube-api-access-bjvdb for pod openshift-network-diagnostics/network-check-target-7mmfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:52.165542 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:52.165201 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:52.165542 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:52.165263 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb podName:caf99631-e974-40a5-90ee-50812f2ae5a4 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:00.165231199 +0000 UTC m=+18.274042092 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bjvdb" (UniqueName: "kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb") pod "network-check-target-7mmfr" (UID: "caf99631-e974-40a5-90ee-50812f2ae5a4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:08:52.165542 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:52.165342 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs podName:f4126f8f-7b88-4c50-82f3-3a91a3388519 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:00.165317053 +0000 UTC m=+18.274127961 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs") pod "network-metrics-daemon-rpz8w" (UID: "f4126f8f-7b88-4c50-82f3-3a91a3388519") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:08:52.486067 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:52.485980 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:52.486225 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:52.486097 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:08:52.486225 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:52.486132 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:52.486225 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:52.486198 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:08:54.484836 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:54.484799 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:54.485290 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:54.484803 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:54.485290 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:54.484919 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:08:54.485290 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:54.485020 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:08:56.485353 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:56.485322 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:56.485353 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:56.485337 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:56.485807 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:56.485447 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:08:56.485807 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:56.485574 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:08:58.485428 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:58.485396 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:08:58.485428 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:08:58.485415 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:08:58.485944 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:58.485507 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:08:58.485944 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:08:58.485662 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:09:00.222120 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:00.222087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:00.222590 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:00.222132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvdb\" (UniqueName: \"kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb\") pod \"network-check-target-7mmfr\" (UID: \"caf99631-e974-40a5-90ee-50812f2ae5a4\") " pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:00.222590 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:00.222234 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:00.222590 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:00.222262 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:00.222590 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:00.222277 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:00.222590 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:00.222289 2570 projected.go:194] Error preparing data for projected volume kube-api-access-bjvdb for pod openshift-network-diagnostics/network-check-target-7mmfr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:00.222590 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:00.222314 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs podName:f4126f8f-7b88-4c50-82f3-3a91a3388519 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.222295657 +0000 UTC m=+34.331106566 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs") pod "network-metrics-daemon-rpz8w" (UID: "f4126f8f-7b88-4c50-82f3-3a91a3388519") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:00.222590 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:00.222332 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb podName:caf99631-e974-40a5-90ee-50812f2ae5a4 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.222323325 +0000 UTC m=+34.331134228 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-bjvdb" (UniqueName: "kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb") pod "network-check-target-7mmfr" (UID: "caf99631-e974-40a5-90ee-50812f2ae5a4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:00.485036 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:00.484940 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:00.485036 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:00.484962 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:00.485237 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:00.485086 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:09:00.485237 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:00.485211 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:09:01.475466 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.475433 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5tcwn"] Apr 22 21:09:01.505481 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.505445 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.507835 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.507814 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 21:09:01.507960 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.507930 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 21:09:01.508992 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.508972 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-q7c24\"" Apr 22 21:09:01.630093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.630060 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8b4v\" (UniqueName: \"kubernetes.io/projected/ac2a064a-e64c-4f46-aa0e-e1056872e044-kube-api-access-f8b4v\") pod \"node-resolver-5tcwn\" (UID: \"ac2a064a-e64c-4f46-aa0e-e1056872e044\") " pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.630222 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.630158 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac2a064a-e64c-4f46-aa0e-e1056872e044-tmp-dir\") pod \"node-resolver-5tcwn\" (UID: \"ac2a064a-e64c-4f46-aa0e-e1056872e044\") " pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.630222 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.630186 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ac2a064a-e64c-4f46-aa0e-e1056872e044-hosts-file\") pod \"node-resolver-5tcwn\" (UID: \"ac2a064a-e64c-4f46-aa0e-e1056872e044\") " pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.730602 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.730530 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8b4v\" (UniqueName: \"kubernetes.io/projected/ac2a064a-e64c-4f46-aa0e-e1056872e044-kube-api-access-f8b4v\") pod \"node-resolver-5tcwn\" (UID: \"ac2a064a-e64c-4f46-aa0e-e1056872e044\") " pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.730602 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.730594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac2a064a-e64c-4f46-aa0e-e1056872e044-tmp-dir\") pod \"node-resolver-5tcwn\" (UID: \"ac2a064a-e64c-4f46-aa0e-e1056872e044\") " pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.730812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.730612 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ac2a064a-e64c-4f46-aa0e-e1056872e044-hosts-file\") pod \"node-resolver-5tcwn\" (UID: \"ac2a064a-e64c-4f46-aa0e-e1056872e044\") " pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.730812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.730677 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ac2a064a-e64c-4f46-aa0e-e1056872e044-hosts-file\") pod \"node-resolver-5tcwn\" (UID: \"ac2a064a-e64c-4f46-aa0e-e1056872e044\") " pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.730949 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.730930 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ac2a064a-e64c-4f46-aa0e-e1056872e044-tmp-dir\") pod \"node-resolver-5tcwn\" (UID: \"ac2a064a-e64c-4f46-aa0e-e1056872e044\") " pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.738997 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.738969 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8b4v\" (UniqueName: \"kubernetes.io/projected/ac2a064a-e64c-4f46-aa0e-e1056872e044-kube-api-access-f8b4v\") pod \"node-resolver-5tcwn\" (UID: \"ac2a064a-e64c-4f46-aa0e-e1056872e044\") " pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.815414 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:01.815386 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5tcwn" Apr 22 21:09:01.937596 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:01.937573 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac2a064a_e64c_4f46_aa0e_e1056872e044.slice/crio-71672a556e2a17e9664d6be35d824001b982d4a272b487b1f9606660c8ecd792 WatchSource:0}: Error finding container 71672a556e2a17e9664d6be35d824001b982d4a272b487b1f9606660c8ecd792: Status 404 returned error can't find the container with id 71672a556e2a17e9664d6be35d824001b982d4a272b487b1f9606660c8ecd792 Apr 22 21:09:02.485874 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.485845 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:02.486520 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:02.485950 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:09:02.486520 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.486035 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:02.486520 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:02.486131 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:09:02.553484 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.553446 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-l48h6" event={"ID":"32a5e549-f5ac-4611-99cf-e4b2fcd750db","Type":"ContainerStarted","Data":"e70b510f21a6270b6955091571d9c51127e7a7038554c613702e6f5885ee45b8"} Apr 22 21:09:02.555343 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.555306 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5tcwn" event={"ID":"ac2a064a-e64c-4f46-aa0e-e1056872e044","Type":"ContainerStarted","Data":"37009002e8b67709f6be1a01c494f06ad12913784d59feb8ca7d995f5ac92c10"} Apr 22 21:09:02.555343 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.555342 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5tcwn" event={"ID":"ac2a064a-e64c-4f46-aa0e-e1056872e044","Type":"ContainerStarted","Data":"71672a556e2a17e9664d6be35d824001b982d4a272b487b1f9606660c8ecd792"} Apr 22 21:09:02.557407 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.557380 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" event={"ID":"62d93d1676550c945f317175b90f5b9f","Type":"ContainerStarted","Data":"28318332f672481c40d99451c7d5403db0371d04554b8ad9232400eda22a6603"} Apr 22 21:09:02.559366 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.559321 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fxk2q" event={"ID":"79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e","Type":"ContainerStarted","Data":"61110edf866b73e89f696d31a8e7850aec37ae3289621f19919db5597e00e9f7"} Apr 22 21:09:02.561182 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.561152 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4wjxv" event={"ID":"400a6d5d-3d9c-4307-9701-895aad7b37b7","Type":"ContainerStarted","Data":"b383581c41ca50f3858416fc32826d54a7258a9ca2fce368be247cf917e194c6"} Apr 22 21:09:02.563271 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.563144 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" event={"ID":"eb7a4eac-7e6d-40ce-abb1-594e34fb2571","Type":"ContainerStarted","Data":"22d04d8713de6d72ced83571567e47738bd6b8f24e6c5b21f0eba824ccbb7bfe"} Apr 22 21:09:02.565519 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.565497 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" event={"ID":"e5233de2-3a1e-46e4-aa55-d01d4beebd14","Type":"ContainerStarted","Data":"c0ba5817d5362520d3ad11833df5920ecb0ce06404ae1aa1966669e141904017"} Apr 22 21:09:02.571708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.571007 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:09:02.571708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.571411 2570 generic.go:358] "Generic (PLEG): container finished" podID="d3a676b5-93c4-4a35-9feb-bcfdb41df40e" containerID="898bb701fef5bdde8d0e9212282a0588721d11fb5c352b71fa5089b93643799c" exitCode=1 Apr 22 21:09:02.571708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.571475 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" 
event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerStarted","Data":"d73090087c00a2d6f3bf0d9fb81997e6a3e2b07e6a13de853a0943dc05a1d649"} Apr 22 21:09:02.571708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.571500 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerStarted","Data":"a26b041564849dcf0cf150741917bf9431977941d93684350167ad36a0215e47"} Apr 22 21:09:02.571708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.571513 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerStarted","Data":"9784301244a9cc74ec164ca7357140b9f7bff41fdc0ca37796a2c4a11a03d77a"} Apr 22 21:09:02.571708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.571525 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerStarted","Data":"dc32ed465b1c80d13c55ee9a39c8c5e64e13c68d1529d4003988def308dad8e8"} Apr 22 21:09:02.571708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.571544 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerDied","Data":"898bb701fef5bdde8d0e9212282a0588721d11fb5c352b71fa5089b93643799c"} Apr 22 21:09:02.571708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.571561 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerStarted","Data":"7e382803763f729f443cf8e27e316299e6d24722e43edb9037910aaf27d773c3"} Apr 22 21:09:02.574272 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.573506 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hdp58" event={"ID":"6521a5ee-a452-4638-a718-497b64cfb146","Type":"ContainerStarted","Data":"f5678ae15ba0d2e15feef3b26431558799a21c31dec14f7285642a88014790dd"} Apr 22 21:09:02.579106 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.579043 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-l48h6" podStartSLOduration=3.759039759 podStartE2EDuration="20.579028104s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:08:45.202653867 +0000 UTC m=+3.311464770" lastFinishedPulling="2026-04-22 21:09:02.02264222 +0000 UTC m=+20.131453115" observedRunningTime="2026-04-22 21:09:02.578006186 +0000 UTC m=+20.686817100" watchObservedRunningTime="2026-04-22 21:09:02.579028104 +0000 UTC m=+20.687839018" Apr 22 21:09:02.592243 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.592188 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4wjxv" podStartSLOduration=3.9087388880000002 podStartE2EDuration="20.592172425s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:08:45.232317522 +0000 UTC m=+3.341128413" lastFinishedPulling="2026-04-22 21:09:01.915751049 +0000 UTC m=+20.024561950" observedRunningTime="2026-04-22 21:09:02.592038814 +0000 UTC m=+20.700849749" watchObservedRunningTime="2026-04-22 21:09:02.592172425 +0000 UTC m=+20.700983340" Apr 22 21:09:02.607065 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.607013 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hdp58" podStartSLOduration=3.88415315 podStartE2EDuration="20.60699559s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:08:45.20500763 +0000 UTC m=+3.313818522" lastFinishedPulling="2026-04-22 21:09:01.927850068 +0000 UTC m=+20.036660962" observedRunningTime="2026-04-22 21:09:02.606705558 +0000 UTC m=+20.715516463" watchObservedRunningTime="2026-04-22 21:09:02.60699559 +0000 UTC m=+20.715806516" Apr 22 21:09:02.643265 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.643193 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fxk2q" podStartSLOduration=3.986226462 podStartE2EDuration="20.643179104s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:08:45.232218038 +0000 UTC m=+3.341028932" lastFinishedPulling="2026-04-22 21:09:01.889170669 +0000 UTC m=+19.997981574" observedRunningTime="2026-04-22 21:09:02.642457541 +0000 UTC m=+20.751268458" watchObservedRunningTime="2026-04-22 21:09:02.643179104 +0000 UTC m=+20.751990016" Apr 22 21:09:02.656544 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.656498 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5tcwn" podStartSLOduration=1.656483445 podStartE2EDuration="1.656483445s" podCreationTimestamp="2026-04-22 21:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:02.655906491 +0000 UTC m=+20.764717405" watchObservedRunningTime="2026-04-22 21:09:02.656483445 +0000 UTC m=+20.765294359" Apr 22 21:09:02.673224 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:02.673180 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-19.ec2.internal" podStartSLOduration=19.673164536 podStartE2EDuration="19.673164536s" podCreationTimestamp="2026-04-22 21:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:02.67267975 +0000 UTC m=+20.781490663" watchObservedRunningTime="2026-04-22 21:09:02.673164536 +0000 UTC m=+20.781975451" Apr 22 21:09:03.095876 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.095749 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 21:09:03.456056 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.455955 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T21:09:03.095873739Z","UUID":"fa7f0be4-83bf-4193-8bdd-ae1032355bac","Handler":null,"Name":"","Endpoint":""} Apr 22 21:09:03.457729 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.457697 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 21:09:03.457729 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.457731 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 21:09:03.576339 ip-10-0-130-19 
kubenswrapper[2570]: I0422 21:09:03.576298 2570 generic.go:358] "Generic (PLEG): container finished" podID="eb7a4eac-7e6d-40ce-abb1-594e34fb2571" containerID="22d04d8713de6d72ced83571567e47738bd6b8f24e6c5b21f0eba824ccbb7bfe" exitCode=0 Apr 22 21:09:03.576970 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.576369 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" event={"ID":"eb7a4eac-7e6d-40ce-abb1-594e34fb2571","Type":"ContainerDied","Data":"22d04d8713de6d72ced83571567e47738bd6b8f24e6c5b21f0eba824ccbb7bfe"} Apr 22 21:09:03.578222 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.578195 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" event={"ID":"e5233de2-3a1e-46e4-aa55-d01d4beebd14","Type":"ContainerStarted","Data":"7fe7b35314b7025d7fd81ca43e0e5c59ffdc84c1d4ca5914058047205220d510"} Apr 22 21:09:03.579964 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.579543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rnctv" event={"ID":"6b522e3e-8771-4c38-8ce3-7ee8c0c5689b","Type":"ContainerStarted","Data":"2fa0b4b383ed4af5b4cf8afbf9278a80cfb5e10e17f6f3ec875364b369f21200"} Apr 22 21:09:03.760315 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.760229 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:09:03.760945 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.760925 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:09:03.773410 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:03.773366 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rnctv" podStartSLOduration=5.085449854 podStartE2EDuration="21.773352193s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:08:45.201269637 +0000 UTC m=+3.310080533" lastFinishedPulling="2026-04-22 21:09:01.889171975 +0000 UTC m=+19.997982872" observedRunningTime="2026-04-22 21:09:03.606344292 +0000 UTC m=+21.715155204" watchObservedRunningTime="2026-04-22 21:09:03.773352193 +0000 UTC m=+21.882163108" Apr 22 21:09:04.488700 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:04.488670 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:04.488947 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:04.488776 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:09:04.488947 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:04.488812 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:04.488947 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:04.488896 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:09:04.583506 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:04.583474 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" event={"ID":"e5233de2-3a1e-46e4-aa55-d01d4beebd14","Type":"ContainerStarted","Data":"250f4077ea70ebb115fb6bc13d8c2ec656d74837155a86c8c769d9ef7ef01731"} Apr 22 21:09:04.584117 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:04.584083 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:09:04.584634 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:04.584585 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fxk2q" Apr 22 21:09:04.598417 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:04.598376 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wqb7c" podStartSLOduration=4.01121196 podStartE2EDuration="22.598363025s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:08:45.226959828 +0000 UTC m=+3.335770733" lastFinishedPulling="2026-04-22 21:09:03.814110907 +0000 UTC m=+21.922921798" observedRunningTime="2026-04-22 21:09:04.59820448 +0000 UTC m=+22.707015394" watchObservedRunningTime="2026-04-22 21:09:04.598363025 +0000 UTC m=+22.707173939" Apr 22 21:09:05.588989 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:05.588953 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:09:05.589585 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:05.589310 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerStarted","Data":"88e0bdca33154c49969be5bdb06ad47a3e978f8510269f983d0905056cd3a0ae"} Apr 22 21:09:06.488225 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:06.488199 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:06.488225 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:06.488220 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:06.488471 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:06.488323 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:09:06.488591 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:06.488485 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:09:07.595182 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:07.595159 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:09:07.595663 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:07.595526 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerStarted","Data":"ae9177b5f95108c9974d962389f8d5abf519ae3418c7e24ba2f4ec59a90cdc3b"} Apr 22 21:09:07.595934 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:07.595912 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:09:07.596000 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:07.595945 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:09:07.596048 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:07.596026 2570 scope.go:117] "RemoveContainer" containerID="898bb701fef5bdde8d0e9212282a0588721d11fb5c352b71fa5089b93643799c" Apr 22 21:09:07.610635 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:07.610612 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:09:08.485398 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:08.485367 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:08.485547 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:08.485369 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:08.485547 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:08.485470 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:09:08.485547 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:08.485537 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:09:08.598161 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:08.598126 2570 generic.go:358] "Generic (PLEG): container finished" podID="eb7a4eac-7e6d-40ce-abb1-594e34fb2571" containerID="eb5961a7167f159b70435e1798d2ad82201da59be561ea50a48a52fc221277dd" exitCode=0 Apr 22 21:09:08.598610 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:08.598209 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" event={"ID":"eb7a4eac-7e6d-40ce-abb1-594e34fb2571","Type":"ContainerDied","Data":"eb5961a7167f159b70435e1798d2ad82201da59be561ea50a48a52fc221277dd"} Apr 22 21:09:08.601374 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:08.601357 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:09:08.601641 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:08.601622 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" event={"ID":"d3a676b5-93c4-4a35-9feb-bcfdb41df40e","Type":"ContainerStarted","Data":"a1cf0f479fbbba163c31d8ecc87928a1bdf64471315d0c4a570e8c594c348218"} Apr 22 21:09:08.601989 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:08.601977 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:09:08.615970 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:08.615951 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:09:09.414948 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:09.414724 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" podStartSLOduration=10.654331941 podStartE2EDuration="27.414707281s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:08:45.227032008 +0000 UTC m=+3.335842913" lastFinishedPulling="2026-04-22 21:09:01.987407359 +0000 UTC m=+20.096218253" observedRunningTime="2026-04-22 21:09:08.644326763 +0000 UTC m=+26.753137678" watchObservedRunningTime="2026-04-22 21:09:09.414707281 +0000 UTC m=+27.523518194" Apr 22 21:09:09.415603 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:09.415578 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7mmfr"] Apr 22 21:09:09.415724 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:09.415705 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:09.415839 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:09.415809 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:09:09.417673 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:09.417648 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rpz8w"] Apr 22 21:09:09.417766 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:09.417759 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:09.417879 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:09.417856 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:09:09.605297 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:09.605244 2570 generic.go:358] "Generic (PLEG): container finished" podID="eb7a4eac-7e6d-40ce-abb1-594e34fb2571" containerID="db9b1206565c5be8e85c0b0b89537e945004743bec26629a743ac4cd6926de90" exitCode=0 Apr 22 21:09:09.605875 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:09.605345 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" event={"ID":"eb7a4eac-7e6d-40ce-abb1-594e34fb2571","Type":"ContainerDied","Data":"db9b1206565c5be8e85c0b0b89537e945004743bec26629a743ac4cd6926de90"} Apr 22 21:09:10.608545 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:10.608444 2570 generic.go:358] "Generic (PLEG): container finished" podID="eb7a4eac-7e6d-40ce-abb1-594e34fb2571" containerID="6754ddeb12fbbf65796b6b8ce8cdf55528335de7ac74afdf7bbfc6ba175a5428" exitCode=0 Apr 22 21:09:10.608545 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:10.608507 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" event={"ID":"eb7a4eac-7e6d-40ce-abb1-594e34fb2571","Type":"ContainerDied","Data":"6754ddeb12fbbf65796b6b8ce8cdf55528335de7ac74afdf7bbfc6ba175a5428"} Apr 22 21:09:11.485080 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:11.485047 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:11.485224 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:11.485099 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:11.485224 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:11.485181 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:09:11.485407 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:11.485330 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:09:13.135195 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.135162 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-jbwgj"] Apr 22 21:09:13.139171 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.139149 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.139317 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:13.139221 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jbwgj" podUID="467caf5c-14f4-4489-a131-5028add687dc" Apr 22 21:09:13.146955 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.146929 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jbwgj"] Apr 22 21:09:13.215832 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.215799 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/467caf5c-14f4-4489-a131-5028add687dc-kubelet-config\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.215994 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.215860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.215994 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.215948 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/467caf5c-14f4-4489-a131-5028add687dc-dbus\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.316322 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.316293 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.316499 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.316338 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/467caf5c-14f4-4489-a131-5028add687dc-dbus\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.316499 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.316392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/467caf5c-14f4-4489-a131-5028add687dc-kubelet-config\") pod 
\"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.316499 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.316473 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/467caf5c-14f4-4489-a131-5028add687dc-kubelet-config\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.316499 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:13.316488 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:13.316650 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:13.316568 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret podName:467caf5c-14f4-4489-a131-5028add687dc nodeName:}" failed. No retries permitted until 2026-04-22 21:09:13.816546804 +0000 UTC m=+31.925357700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret") pod "global-pull-secret-syncer-jbwgj" (UID: "467caf5c-14f4-4489-a131-5028add687dc") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:13.316650 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.316595 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/467caf5c-14f4-4489-a131-5028add687dc-dbus\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.485546 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.485467 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:13.485708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.485471 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:13.485708 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:13.485596 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7mmfr" podUID="caf99631-e974-40a5-90ee-50812f2ae5a4" Apr 22 21:09:13.485708 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:13.485692 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpz8w" podUID="f4126f8f-7b88-4c50-82f3-3a91a3388519" Apr 22 21:09:13.615009 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.614977 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.615167 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:13.615080 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-jbwgj" podUID="467caf5c-14f4-4489-a131-5028add687dc" Apr 22 21:09:13.820032 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:13.819933 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:13.820165 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:13.820100 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:13.820208 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:13.820187 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret podName:467caf5c-14f4-4489-a131-5028add687dc nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.820153932 +0000 UTC m=+32.928964839 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret") pod "global-pull-secret-syncer-jbwgj" (UID: "467caf5c-14f4-4489-a131-5028add687dc") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:14.826972 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:14.826727 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:14.827431 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:14.826905 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:14.827431 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:14.827104 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret podName:467caf5c-14f4-4489-a131-5028add687dc nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.827088009 +0000 UTC m=+34.935898903 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret") pod "global-pull-secret-syncer-jbwgj" (UID: "467caf5c-14f4-4489-a131-5028add687dc") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:15.148239 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.148164 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-19.ec2.internal" event="NodeReady" Apr 22 21:09:15.148430 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.148330 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 21:09:15.178727 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.178628 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6854cd699f-kt8sj"] Apr 22 21:09:15.206868 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.206836 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5nj7r"] Apr 22 21:09:15.207040 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.206993 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.209519 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.209395 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 21:09:15.209519 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.209401 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 21:09:15.209519 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.209416 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jsc9g\"" Apr 22 21:09:15.209519 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.209430 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 21:09:15.214638 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.214617 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 21:09:15.230467 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.230446 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2zb4s"] Apr 22 21:09:15.230616 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.230596 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.232977 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.232957 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 21:09:15.233106 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.232962 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 21:09:15.233260 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.233234 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ll98\"" Apr 22 21:09:15.249065 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.249042 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6854cd699f-kt8sj"] Apr 22 21:09:15.249065 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.249064 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5nj7r"] Apr 22 21:09:15.249228 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.249073 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2zb4s"] Apr 22 21:09:15.249228 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.249179 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:15.251516 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.251497 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 21:09:15.251666 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.251597 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-c86tt\"" Apr 22 21:09:15.251725 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.251701 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 21:09:15.251798 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.251783 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 21:09:15.331209 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331169 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-certificates\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.331209 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331211 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4vm2\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-kube-api-access-g4vm2\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.331445 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331227 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdjw9\" (UniqueName: \"kubernetes.io/projected/070e50f3-495a-4586-b0bd-a251eb98bccc-kube-api-access-sdjw9\") pod 
\"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:15.331445 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331257 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-trusted-ca\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.331445 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331379 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-image-registry-private-configuration\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.331445 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331413 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-ca-trust-extracted\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.331445 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331434 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-bound-sa-token\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.331651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331476 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bcsx\" (UniqueName: \"kubernetes.io/projected/570e8677-7a14-41e1-af96-2344f7ef5d3a-kube-api-access-6bcsx\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.331651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331533 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-installation-pull-secrets\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.331651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:15.331651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331566 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/570e8677-7a14-41e1-af96-2344f7ef5d3a-config-volume\") pod 
\"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.331651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331587 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.331651 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331624 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.331917 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.331654 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/570e8677-7a14-41e1-af96-2344f7ef5d3a-tmp-dir\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.432178 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432090 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-image-registry-private-configuration\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.432178 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432130 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-ca-trust-extracted\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.432178 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432157 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-bound-sa-token\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.432460 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432188 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bcsx\" (UniqueName: \"kubernetes.io/projected/570e8677-7a14-41e1-af96-2344f7ef5d3a-kube-api-access-6bcsx\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.432460 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432212 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-installation-pull-secrets\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.432460 
ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:15.432460 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432243 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/570e8677-7a14-41e1-af96-2344f7ef5d3a-config-volume\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.432656 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432471 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.432656 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.432656 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/570e8677-7a14-41e1-af96-2344f7ef5d3a-tmp-dir\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.432656 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432567 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-ca-trust-extracted\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.432656 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-certificates\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.432656 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.432648 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:15.432969 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432660 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4vm2\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-kube-api-access-g4vm2\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.432969 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432685 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sdjw9\" (UniqueName: \"kubernetes.io/projected/070e50f3-495a-4586-b0bd-a251eb98bccc-kube-api-access-sdjw9\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:15.432969 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.432697 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls podName:570e8677-7a14-41e1-af96-2344f7ef5d3a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:15.932681502 +0000 UTC m=+34.041492393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls") pod "dns-default-5nj7r" (UID: "570e8677-7a14-41e1-af96-2344f7ef5d3a") : secret "dns-default-metrics-tls" not found Apr 22 21:09:15.432969 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-trusted-ca\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.432969 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/570e8677-7a14-41e1-af96-2344f7ef5d3a-config-volume\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.432969 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.432837 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/570e8677-7a14-41e1-af96-2344f7ef5d3a-tmp-dir\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.432969 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.432909 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:15.432969 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.432923 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6854cd699f-kt8sj: secret "image-registry-tls" not found Apr 22 21:09:15.432969 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.432970 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls podName:c437a3d4-bcaa-4353-b17a-d8d4f6753b20 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:15.932953767 +0000 UTC m=+34.041764662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls") pod "image-registry-6854cd699f-kt8sj" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20") : secret "image-registry-tls" not found Apr 22 21:09:15.433402 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.433075 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:15.433402 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.433116 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert podName:070e50f3-495a-4586-b0bd-a251eb98bccc nodeName:}" failed. No retries permitted until 2026-04-22 21:09:15.933102733 +0000 UTC m=+34.041913641 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert") pod "ingress-canary-2zb4s" (UID: "070e50f3-495a-4586-b0bd-a251eb98bccc") : secret "canary-serving-cert" not found Apr 22 21:09:15.433480 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.433465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-certificates\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.436883 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.436861 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-image-registry-private-configuration\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.436988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.436861 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-installation-pull-secrets\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.441325 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.441292 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-trusted-ca\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.442056 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.442038 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bcsx\" (UniqueName: \"kubernetes.io/projected/570e8677-7a14-41e1-af96-2344f7ef5d3a-kube-api-access-6bcsx\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.442453 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.442396 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4vm2\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-kube-api-access-g4vm2\") pod 
\"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.442551 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.442540 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-bound-sa-token\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.449922 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.449896 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdjw9\" (UniqueName: \"kubernetes.io/projected/070e50f3-495a-4586-b0bd-a251eb98bccc-kube-api-access-sdjw9\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:15.485141 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.485111 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:15.485294 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.485114 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:15.485406 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.485114 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:15.488309 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.488289 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 21:09:15.488415 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.488335 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 21:09:15.488415 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.488381 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bm97r\"" Apr 22 21:09:15.488514 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.488450 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 21:09:15.488593 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.488577 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ccfht\"" Apr 22 21:09:15.488663 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.488606 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 21:09:15.937891 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.937849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:15.938381 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.937905 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:15.938381 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:15.937937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:15.938381 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.938026 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:15.938381 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.938053 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:15.938381 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.938075 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:15.938381 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.938088 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6854cd699f-kt8sj: secret "image-registry-tls" not found Apr 22 21:09:15.938381 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.938108 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert podName:070e50f3-495a-4586-b0bd-a251eb98bccc nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.938086696 +0000 UTC m=+35.046897610 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert") pod "ingress-canary-2zb4s" (UID: "070e50f3-495a-4586-b0bd-a251eb98bccc") : secret "canary-serving-cert" not found Apr 22 21:09:15.938381 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.938123 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls podName:570e8677-7a14-41e1-af96-2344f7ef5d3a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.938116845 +0000 UTC m=+35.046927735 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls") pod "dns-default-5nj7r" (UID: "570e8677-7a14-41e1-af96-2344f7ef5d3a") : secret "dns-default-metrics-tls" not found Apr 22 21:09:15.938381 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:15.938135 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls podName:c437a3d4-bcaa-4353-b17a-d8d4f6753b20 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.938128247 +0000 UTC m=+35.046939141 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls") pod "image-registry-6854cd699f-kt8sj" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20") : secret "image-registry-tls" not found Apr 22 21:09:16.240773 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.240702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w" Apr 22 21:09:16.241007 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.240777 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvdb\" (UniqueName: \"kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb\") pod \"network-check-target-7mmfr\" (UID: \"caf99631-e974-40a5-90ee-50812f2ae5a4\") " pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:16.241007 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:16.240880 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 21:09:16.241007 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:16.240965 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs podName:f4126f8f-7b88-4c50-82f3-3a91a3388519 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:48.240941077 +0000 UTC m=+66.349751984 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs") pod "network-metrics-daemon-rpz8w" (UID: "f4126f8f-7b88-4c50-82f3-3a91a3388519") : secret "metrics-daemon-secret" not found Apr 22 21:09:16.243135 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.243117 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjvdb\" (UniqueName: \"kubernetes.io/projected/caf99631-e974-40a5-90ee-50812f2ae5a4-kube-api-access-bjvdb\") pod \"network-check-target-7mmfr\" (UID: \"caf99631-e974-40a5-90ee-50812f2ae5a4\") " pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:16.396734 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.396700 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:16.566010 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.565981 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7mmfr"] Apr 22 21:09:16.587190 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:16.587161 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaf99631_e974_40a5_90ee_50812f2ae5a4.slice/crio-a36b8f85ceb56841c8558c3b9c5adf80e3b2bc3dca6fe21b75e0e343e193e8b4 WatchSource:0}: Error finding container a36b8f85ceb56841c8558c3b9c5adf80e3b2bc3dca6fe21b75e0e343e193e8b4: Status 404 returned error can't find the container with id a36b8f85ceb56841c8558c3b9c5adf80e3b2bc3dca6fe21b75e0e343e193e8b4 Apr 22 21:09:16.622556 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.622519 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7mmfr" event={"ID":"caf99631-e974-40a5-90ee-50812f2ae5a4","Type":"ContainerStarted","Data":"a36b8f85ceb56841c8558c3b9c5adf80e3b2bc3dca6fe21b75e0e343e193e8b4"} Apr 22 21:09:16.844316 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.844095 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:16.847682 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.847651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/467caf5c-14f4-4489-a131-5028add687dc-original-pull-secret\") pod \"global-pull-secret-syncer-jbwgj\" (UID: \"467caf5c-14f4-4489-a131-5028add687dc\") " pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:16.945429 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.945399 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:16.945747 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.945433 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:16.945747 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.945455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:16.945747 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:16.945531 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:16.945747 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:16.945551 2570 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:16.945747 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:16.945589 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert podName:070e50f3-495a-4586-b0bd-a251eb98bccc nodeName:}" failed. No retries permitted until 2026-04-22 21:09:18.945574145 +0000 UTC m=+37.054385036 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert") pod "ingress-canary-2zb4s" (UID: "070e50f3-495a-4586-b0bd-a251eb98bccc") : secret "canary-serving-cert" not found Apr 22 21:09:16.945747 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:16.945555 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:16.945747 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:16.945602 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6854cd699f-kt8sj: secret "image-registry-tls" not found Apr 22 21:09:16.945747 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:16.945604 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls podName:570e8677-7a14-41e1-af96-2344f7ef5d3a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:18.945597584 +0000 UTC m=+37.054408474 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls") pod "dns-default-5nj7r" (UID: "570e8677-7a14-41e1-af96-2344f7ef5d3a") : secret "dns-default-metrics-tls" not found Apr 22 21:09:16.945747 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:16.945633 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls podName:c437a3d4-bcaa-4353-b17a-d8d4f6753b20 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:18.945616478 +0000 UTC m=+37.054427372 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls") pod "image-registry-6854cd699f-kt8sj" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20") : secret "image-registry-tls" not found Apr 22 21:09:16.963921 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.963705 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846"] Apr 22 21:09:16.980314 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.980290 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk"] Apr 22 21:09:16.980450 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.980431 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" Apr 22 21:09:16.983079 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.983047 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-4pj7r\"" Apr 22 21:09:16.983079 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.983066 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 21:09:16.983325 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.983124 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 21:09:16.983325 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.983165 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 21:09:16.983449 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.983403 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 21:09:16.997911 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.997889 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846"] Apr 22 21:09:16.997911 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.997913 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk"] Apr 22 21:09:16.998031 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:16.997992 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.000056 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.000040 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 21:09:17.000367 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.000348 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 21:09:17.000426 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.000403 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 21:09:17.000634 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.000614 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 21:09:17.004589 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.004346 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-jbwgj" Apr 22 21:09:17.046788 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.046704 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b00bf3c8-0330-4689-b9db-9998bbcf018b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7888d68599-qp846\" (UID: \"b00bf3c8-0330-4689-b9db-9998bbcf018b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" Apr 22 21:09:17.046788 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.046760 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhf4\" (UniqueName: \"kubernetes.io/projected/b00bf3c8-0330-4689-b9db-9998bbcf018b-kube-api-access-dlhf4\") pod \"managed-serviceaccount-addon-agent-7888d68599-qp846\" (UID: \"b00bf3c8-0330-4689-b9db-9998bbcf018b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" Apr 22 21:09:17.128780 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.128747 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-jbwgj"] Apr 22 21:09:17.132926 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:17.132898 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467caf5c_14f4_4489_a131_5028add687dc.slice/crio-ae3743555dc067408beb79ab710f01a0158fb0e5a15dd5822fc30d0a644e590f WatchSource:0}: Error finding container ae3743555dc067408beb79ab710f01a0158fb0e5a15dd5822fc30d0a644e590f: Status 404 returned error can't find the container with id ae3743555dc067408beb79ab710f01a0158fb0e5a15dd5822fc30d0a644e590f Apr 22 21:09:17.147509 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.147489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhf4\" (UniqueName: \"kubernetes.io/projected/b00bf3c8-0330-4689-b9db-9998bbcf018b-kube-api-access-dlhf4\") pod \"managed-serviceaccount-addon-agent-7888d68599-qp846\" (UID: \"b00bf3c8-0330-4689-b9db-9998bbcf018b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" Apr 22 21:09:17.147658 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.147556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/70973331-d0fd-42a7-81d2-3009076d2a1f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.147658 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.147648 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-hub\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.147818 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.147715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b00bf3c8-0330-4689-b9db-9998bbcf018b-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-7888d68599-qp846\" (UID: \"b00bf3c8-0330-4689-b9db-9998bbcf018b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" Apr 22 21:09:17.147818 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.147747 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-ca\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.147818 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.147776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29m24\" (UniqueName: \"kubernetes.io/projected/70973331-d0fd-42a7-81d2-3009076d2a1f-kube-api-access-29m24\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.147818 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.147804 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.148017 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.147855 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.150874 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.150836 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b00bf3c8-0330-4689-b9db-9998bbcf018b-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7888d68599-qp846\" (UID: \"b00bf3c8-0330-4689-b9db-9998bbcf018b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" Apr 22 21:09:17.154423 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.154402 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhf4\" (UniqueName: \"kubernetes.io/projected/b00bf3c8-0330-4689-b9db-9998bbcf018b-kube-api-access-dlhf4\") pod \"managed-serviceaccount-addon-agent-7888d68599-qp846\" (UID: \"b00bf3c8-0330-4689-b9db-9998bbcf018b\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" Apr 22 21:09:17.248555 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.248518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-ca\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.248726 ip-10-0-130-19 
kubenswrapper[2570]: I0422 21:09:17.248565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29m24\" (UniqueName: \"kubernetes.io/projected/70973331-d0fd-42a7-81d2-3009076d2a1f-kube-api-access-29m24\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.248726 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.248594 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.248833 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.248760 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.248890 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.248850 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/70973331-d0fd-42a7-81d2-3009076d2a1f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.248943 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.248919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-hub\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.249672 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.249641 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/70973331-d0fd-42a7-81d2-3009076d2a1f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.251497 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.251476 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-ca\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.251628 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.251537 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.251628 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.251602 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.251742 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.251669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/70973331-d0fd-42a7-81d2-3009076d2a1f-hub\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.255880 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.255860 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29m24\" (UniqueName: \"kubernetes.io/projected/70973331-d0fd-42a7-81d2-3009076d2a1f-kube-api-access-29m24\") pod \"cluster-proxy-proxy-agent-5f54c74c8c-vmvsk\" (UID: \"70973331-d0fd-42a7-81d2-3009076d2a1f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.299484 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.299409 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" Apr 22 21:09:17.309164 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.309139 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:17.459867 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.459819 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846"] Apr 22 21:09:17.464000 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.463975 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk"] Apr 22 21:09:17.470505 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:17.470330 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb00bf3c8_0330_4689_b9db_9998bbcf018b.slice/crio-fbb06558c1d3f4ed60dc93a5b9f1390550946213c6c76f8df5e2369737f08ee5 WatchSource:0}: Error finding container fbb06558c1d3f4ed60dc93a5b9f1390550946213c6c76f8df5e2369737f08ee5: Status 404 returned error can't find the container with id fbb06558c1d3f4ed60dc93a5b9f1390550946213c6c76f8df5e2369737f08ee5 Apr 22 21:09:17.471322 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:17.471240 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70973331_d0fd_42a7_81d2_3009076d2a1f.slice/crio-69d17019e6998d81fa46f9fd6fde2f690741cbcaf10024c70a2e0d7cdcedf669 WatchSource:0}: Error finding container 69d17019e6998d81fa46f9fd6fde2f690741cbcaf10024c70a2e0d7cdcedf669: Status 404 returned error can't find the container with id 69d17019e6998d81fa46f9fd6fde2f690741cbcaf10024c70a2e0d7cdcedf669 Apr 22 21:09:17.626051 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.625957 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" event={"ID":"70973331-d0fd-42a7-81d2-3009076d2a1f","Type":"ContainerStarted","Data":"69d17019e6998d81fa46f9fd6fde2f690741cbcaf10024c70a2e0d7cdcedf669"} Apr 22 21:09:17.627146 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.627106 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" event={"ID":"b00bf3c8-0330-4689-b9db-9998bbcf018b","Type":"ContainerStarted","Data":"fbb06558c1d3f4ed60dc93a5b9f1390550946213c6c76f8df5e2369737f08ee5"} Apr 22 21:09:17.628154 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.628121 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jbwgj" event={"ID":"467caf5c-14f4-4489-a131-5028add687dc","Type":"ContainerStarted","Data":"ae3743555dc067408beb79ab710f01a0158fb0e5a15dd5822fc30d0a644e590f"} Apr 22 21:09:17.630843 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.630816 2570 generic.go:358] "Generic (PLEG): container finished" podID="eb7a4eac-7e6d-40ce-abb1-594e34fb2571" containerID="b6c41686a20851884f372416ecd08ecf5b2358b5b9307f003ba1d232b6ded7fb" exitCode=0 Apr 22 21:09:17.630948 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:17.630853 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" event={"ID":"eb7a4eac-7e6d-40ce-abb1-594e34fb2571","Type":"ContainerDied","Data":"b6c41686a20851884f372416ecd08ecf5b2358b5b9307f003ba1d232b6ded7fb"} Apr 22 21:09:18.638132 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:18.637496 2570 generic.go:358] "Generic (PLEG): container finished" 
podID="eb7a4eac-7e6d-40ce-abb1-594e34fb2571" containerID="bdb3ea27894341819a886b7acdaa79a28cc5b721e5b9af82ce8ad41adf1043bc" exitCode=0 Apr 22 21:09:18.638132 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:18.637561 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" event={"ID":"eb7a4eac-7e6d-40ce-abb1-594e34fb2571","Type":"ContainerDied","Data":"bdb3ea27894341819a886b7acdaa79a28cc5b721e5b9af82ce8ad41adf1043bc"} Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:18.962565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:18.962614 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:18.962646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:18.962836 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:18.962854 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6854cd699f-kt8sj: secret "image-registry-tls" not found Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:18.962915 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls podName:c437a3d4-bcaa-4353-b17a-d8d4f6753b20 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:22.962894099 +0000 UTC m=+41.071705006 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls") pod "image-registry-6854cd699f-kt8sj" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20") : secret "image-registry-tls" not found Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:18.963354 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:18.963405 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert podName:070e50f3-495a-4586-b0bd-a251eb98bccc nodeName:}" failed. No retries permitted until 2026-04-22 21:09:22.963389684 +0000 UTC m=+41.072200593 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert") pod "ingress-canary-2zb4s" (UID: "070e50f3-495a-4586-b0bd-a251eb98bccc") : secret "canary-serving-cert" not found Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:18.963474 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:18.963552 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:18.963504 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls podName:570e8677-7a14-41e1-af96-2344f7ef5d3a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:22.963494236 +0000 UTC m=+41.072305127 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls") pod "dns-default-5nj7r" (UID: "570e8677-7a14-41e1-af96-2344f7ef5d3a") : secret "dns-default-metrics-tls" not found Apr 22 21:09:19.646001 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:19.645399 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" event={"ID":"eb7a4eac-7e6d-40ce-abb1-594e34fb2571","Type":"ContainerStarted","Data":"2713fe79cfd2db2eb3761fc3cc3910b6ee3f92cb257dcd9ec6dc90309e40c8be"} Apr 22 21:09:19.672198 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:19.670452 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bb9x8" podStartSLOduration=6.273433715 podStartE2EDuration="37.67043493s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:08:45.2324133 +0000 UTC m=+3.341224190" lastFinishedPulling="2026-04-22 21:09:16.629414501 +0000 UTC m=+34.738225405" observedRunningTime="2026-04-22 21:09:19.669437519 +0000 UTC m=+37.778248433" watchObservedRunningTime="2026-04-22 21:09:19.67043493 +0000 UTC m=+37.779245841" Apr 22 21:09:22.998702 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:22.998671 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:22.998702 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:22.998705 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:22.999120 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:22.998724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:22.999120 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:22.998827 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:22.999120 ip-10-0-130-19 kubenswrapper[2570]: E0422 
21:09:22.998894 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert podName:070e50f3-495a-4586-b0bd-a251eb98bccc nodeName:}" failed. No retries permitted until 2026-04-22 21:09:30.998874505 +0000 UTC m=+49.107685415 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert") pod "ingress-canary-2zb4s" (UID: "070e50f3-495a-4586-b0bd-a251eb98bccc") : secret "canary-serving-cert" not found Apr 22 21:09:22.999120 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:22.998833 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:22.999120 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:22.998968 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls podName:570e8677-7a14-41e1-af96-2344f7ef5d3a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:30.998957259 +0000 UTC m=+49.107768150 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls") pod "dns-default-5nj7r" (UID: "570e8677-7a14-41e1-af96-2344f7ef5d3a") : secret "dns-default-metrics-tls" not found Apr 22 21:09:22.999120 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:22.998835 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:22.999120 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:22.998983 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6854cd699f-kt8sj: secret "image-registry-tls" not found Apr 22 21:09:22.999120 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:22.999005 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls podName:c437a3d4-bcaa-4353-b17a-d8d4f6753b20 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:30.998996309 +0000 UTC m=+49.107807202 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls") pod "image-registry-6854cd699f-kt8sj" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20") : secret "image-registry-tls" not found Apr 22 21:09:25.502921 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.502889 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm"] Apr 22 21:09:25.541814 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.541785 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm"] Apr 22 21:09:25.541952 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.541902 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm" Apr 22 21:09:25.544659 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.544637 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 21:09:25.544779 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.544639 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 21:09:25.545627 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.545609 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-rmr8z\"" Apr 22 21:09:25.621883 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.621851 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhjq8\" (UniqueName: \"kubernetes.io/projected/32a4c04c-bb6b-4ecb-8790-7acaad9a70f1-kube-api-access-nhjq8\") pod \"migrator-74bb7799d9-6mgmm\" (UID: \"32a4c04c-bb6b-4ecb-8790-7acaad9a70f1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm" Apr 22 21:09:25.658639 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.658612 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7mmfr" event={"ID":"caf99631-e974-40a5-90ee-50812f2ae5a4","Type":"ContainerStarted","Data":"1b007dbfe5c1c6ab12ba16aa7c7600ce66f5bd0a64ef5bd06167878ddf968528"} Apr 22 21:09:25.658771 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.658703 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:25.659861 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.659837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" event={"ID":"b00bf3c8-0330-4689-b9db-9998bbcf018b","Type":"ContainerStarted","Data":"2e383790cd95adfece0304d2fe34486f2bc9689e137b5bcea082198d915f6025"} Apr 22 21:09:25.661009 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.660990 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-jbwgj" event={"ID":"467caf5c-14f4-4489-a131-5028add687dc","Type":"ContainerStarted","Data":"e98c093485b9194c460556ede97a32ef73ef806c545b5555732d7ed845b433a4"} Apr 22 21:09:25.662125 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.662107 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" event={"ID":"70973331-d0fd-42a7-81d2-3009076d2a1f","Type":"ContainerStarted","Data":"940e4ee011dd264619833c5dfadf75f5c96c5cbbf1faa9ac25994a03d3d8caaf"} Apr 22 21:09:25.680346 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.680304 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7mmfr" podStartSLOduration=35.800683459 podStartE2EDuration="43.680293664s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.606405653 +0000 UTC m=+34.715216548" lastFinishedPulling="2026-04-22 21:09:24.486015861 +0000 UTC m=+42.594826753" observedRunningTime="2026-04-22 21:09:25.679299343 +0000 UTC m=+43.788110256" 
watchObservedRunningTime="2026-04-22 21:09:25.680293664 +0000 UTC m=+43.789104577" Apr 22 21:09:25.697834 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.697797 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7888d68599-qp846" podStartSLOduration=2.660922018 podStartE2EDuration="9.697787059s" podCreationTimestamp="2026-04-22 21:09:16 +0000 UTC" firstStartedPulling="2026-04-22 21:09:17.472695152 +0000 UTC m=+35.581506043" lastFinishedPulling="2026-04-22 21:09:24.509560179 +0000 UTC m=+42.618371084" observedRunningTime="2026-04-22 21:09:25.697111297 +0000 UTC m=+43.805922210" watchObservedRunningTime="2026-04-22 21:09:25.697787059 +0000 UTC m=+43.806597975" Apr 22 21:09:25.718483 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.718439 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-jbwgj" podStartSLOduration=5.378390509 podStartE2EDuration="12.718427146s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:17.145980169 +0000 UTC m=+35.254791066" lastFinishedPulling="2026-04-22 21:09:24.486016807 +0000 UTC m=+42.594827703" observedRunningTime="2026-04-22 21:09:25.717802829 +0000 UTC m=+43.826613743" watchObservedRunningTime="2026-04-22 21:09:25.718427146 +0000 UTC m=+43.827238059" Apr 22 21:09:25.723107 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.723089 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhjq8\" (UniqueName: \"kubernetes.io/projected/32a4c04c-bb6b-4ecb-8790-7acaad9a70f1-kube-api-access-nhjq8\") pod \"migrator-74bb7799d9-6mgmm\" (UID: \"32a4c04c-bb6b-4ecb-8790-7acaad9a70f1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm" Apr 22 21:09:25.733744 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.733718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhjq8\" (UniqueName: \"kubernetes.io/projected/32a4c04c-bb6b-4ecb-8790-7acaad9a70f1-kube-api-access-nhjq8\") pod \"migrator-74bb7799d9-6mgmm\" (UID: \"32a4c04c-bb6b-4ecb-8790-7acaad9a70f1\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm" Apr 22 21:09:25.850667 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.850598 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm" Apr 22 21:09:25.978416 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:25.978382 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm"] Apr 22 21:09:25.981335 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:25.981305 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32a4c04c_bb6b_4ecb_8790_7acaad9a70f1.slice/crio-443a64a416d7492ec28ffe81aca8c99ab34bab8c4e33c3f5acc2d7924d232160 WatchSource:0}: Error finding container 443a64a416d7492ec28ffe81aca8c99ab34bab8c4e33c3f5acc2d7924d232160: Status 404 returned error can't find the container with id 443a64a416d7492ec28ffe81aca8c99ab34bab8c4e33c3f5acc2d7924d232160 Apr 22 21:09:26.240902 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:26.240874 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5tcwn_ac2a064a-e64c-4f46-aa0e-e1056872e044/dns-node-resolver/0.log" Apr 22 21:09:26.665137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:26.665058 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm" event={"ID":"32a4c04c-bb6b-4ecb-8790-7acaad9a70f1","Type":"ContainerStarted","Data":"443a64a416d7492ec28ffe81aca8c99ab34bab8c4e33c3f5acc2d7924d232160"} Apr 22 21:09:27.241513 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:27.241483 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4wjxv_400a6d5d-3d9c-4307-9701-895aad7b37b7/node-ca/0.log" Apr 22 21:09:28.673801 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:28.673720 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" event={"ID":"70973331-d0fd-42a7-81d2-3009076d2a1f","Type":"ContainerStarted","Data":"c22c1865ee37c3242e39045ab9a847be308c4eb5f1e2e0bdc926187375ca1dbe"} Apr 22 21:09:28.673801 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:28.673760 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" event={"ID":"70973331-d0fd-42a7-81d2-3009076d2a1f","Type":"ContainerStarted","Data":"5629391fc7e8da4f995b4e3244926692bf8eda386a4823e363fe5fcd9801014f"} Apr 22 21:09:28.675351 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:28.675326 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm" event={"ID":"32a4c04c-bb6b-4ecb-8790-7acaad9a70f1","Type":"ContainerStarted","Data":"aa40b7256237badd95ae3b04c73f8c90248b582a5067102f9bd5e388a039b1e9"} Apr 22 21:09:28.675465 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:28.675358 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm" event={"ID":"32a4c04c-bb6b-4ecb-8790-7acaad9a70f1","Type":"ContainerStarted","Data":"a846068a6b60650ebbf8ddff891919c216129f5110915e52b60c5193ca4803d9"} Apr 22 21:09:28.690683 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:28.690642 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" podStartSLOduration=2.203483561 podStartE2EDuration="12.690628316s" podCreationTimestamp="2026-04-22 21:09:16 +0000 UTC" 
firstStartedPulling="2026-04-22 21:09:17.473469406 +0000 UTC m=+35.582280300" lastFinishedPulling="2026-04-22 21:09:27.960614152 +0000 UTC m=+46.069425055" observedRunningTime="2026-04-22 21:09:28.690167851 +0000 UTC m=+46.798978764" watchObservedRunningTime="2026-04-22 21:09:28.690628316 +0000 UTC m=+46.799439228" Apr 22 21:09:28.703683 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:28.703642 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-6mgmm" podStartSLOduration=1.723834916 podStartE2EDuration="3.703632302s" podCreationTimestamp="2026-04-22 21:09:25 +0000 UTC" firstStartedPulling="2026-04-22 21:09:25.983434751 +0000 UTC m=+44.092245643" lastFinishedPulling="2026-04-22 21:09:27.963232138 +0000 UTC m=+46.072043029" observedRunningTime="2026-04-22 21:09:28.702871122 +0000 UTC m=+46.811682029" watchObservedRunningTime="2026-04-22 21:09:28.703632302 +0000 UTC m=+46.812443265" Apr 22 21:09:31.064199 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:31.064157 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:31.064199 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:31.064197 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:31.064618 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:31.064215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:31.064618 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:31.064343 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:31.064618 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:31.064355 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6854cd699f-kt8sj: secret "image-registry-tls" not found Apr 22 21:09:31.064618 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:31.064354 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:31.064618 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:31.064397 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls podName:c437a3d4-bcaa-4353-b17a-d8d4f6753b20 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:47.064383887 +0000 UTC m=+65.173194778 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls") pod "image-registry-6854cd699f-kt8sj" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20") : secret "image-registry-tls" not found Apr 22 21:09:31.064618 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:31.064422 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls podName:570e8677-7a14-41e1-af96-2344f7ef5d3a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:47.064408189 +0000 UTC m=+65.173219080 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls") pod "dns-default-5nj7r" (UID: "570e8677-7a14-41e1-af96-2344f7ef5d3a") : secret "dns-default-metrics-tls" not found Apr 22 21:09:31.064618 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:31.064362 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:31.064618 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:09:31.064451 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert podName:070e50f3-495a-4586-b0bd-a251eb98bccc nodeName:}" failed. No retries permitted until 2026-04-22 21:09:47.064442718 +0000 UTC m=+65.173253609 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert") pod "ingress-canary-2zb4s" (UID: "070e50f3-495a-4586-b0bd-a251eb98bccc") : secret "canary-serving-cert" not found Apr 22 21:09:37.311782 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:37.311719 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" podUID="70973331-d0fd-42a7-81d2-3009076d2a1f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 21:09:40.624665 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:40.624638 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42mgf" Apr 22 21:09:47.079384 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.079342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s" Apr 22 21:09:47.079384 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.079383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r" Apr 22 21:09:47.079828 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.079401 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:09:47.081847 ip-10-0-130-19 
Apr 22 21:09:47.081847 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.081821 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/570e8677-7a14-41e1-af96-2344f7ef5d3a-metrics-tls\") pod \"dns-default-5nj7r\" (UID: \"570e8677-7a14-41e1-af96-2344f7ef5d3a\") " pod="openshift-dns/dns-default-5nj7r"
Apr 22 21:09:47.081931 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.081865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/070e50f3-495a-4586-b0bd-a251eb98bccc-cert\") pod \"ingress-canary-2zb4s\" (UID: \"070e50f3-495a-4586-b0bd-a251eb98bccc\") " pod="openshift-ingress-canary/ingress-canary-2zb4s"
Apr 22 21:09:47.081931 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.081835 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"image-registry-6854cd699f-kt8sj\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " pod="openshift-image-registry/image-registry-6854cd699f-kt8sj"
Apr 22 21:09:47.310891 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.310848 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" podUID="70973331-d0fd-42a7-81d2-3009076d2a1f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 22 21:09:47.320407 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.320384 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jsc9g\""
Apr 22 21:09:47.329116 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.329094 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj"
Apr 22 21:09:47.347194 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.347174 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ll98\""
Apr 22 21:09:47.354915 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.354888 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5nj7r"
Apr 22 21:09:47.361537 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.361515 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-c86tt\""
Apr 22 21:09:47.369994 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.369967 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2zb4s"
Apr 22 21:09:47.475105 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.475075 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6854cd699f-kt8sj"]
Apr 22 21:09:47.478179 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:47.478152 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc437a3d4_bcaa_4353_b17a_d8d4f6753b20.slice/crio-2be1a10414b14adcf0d8fce7c450e648127f231fe238022e540bedd23f15d7dd WatchSource:0}: Error finding container 2be1a10414b14adcf0d8fce7c450e648127f231fe238022e540bedd23f15d7dd: Status 404 returned error can't find the container with id 2be1a10414b14adcf0d8fce7c450e648127f231fe238022e540bedd23f15d7dd
Apr 22 21:09:47.484888 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.484866 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5nj7r"]
Apr 22 21:09:47.487769 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:47.487744 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod570e8677_7a14_41e1_af96_2344f7ef5d3a.slice/crio-18fa74f719443cb09262d56a16843b6fa6e4be899dd623c0c64f7e0e5f80531a WatchSource:0}: Error finding container 18fa74f719443cb09262d56a16843b6fa6e4be899dd623c0c64f7e0e5f80531a: Status 404 returned error can't find the container with id 18fa74f719443cb09262d56a16843b6fa6e4be899dd623c0c64f7e0e5f80531a
Apr 22 21:09:47.498381 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.498359 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2zb4s"]
Apr 22 21:09:47.501966 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:47.501942 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod070e50f3_495a_4586_b0bd_a251eb98bccc.slice/crio-6523e145b6e4e4b3962a17ccbae48057bfab0dd084d9e40a1c3c9ec9ea385a41 WatchSource:0}: Error finding container 6523e145b6e4e4b3962a17ccbae48057bfab0dd084d9e40a1c3c9ec9ea385a41: Status 404 returned error can't find the container with id 6523e145b6e4e4b3962a17ccbae48057bfab0dd084d9e40a1c3c9ec9ea385a41
Apr 22 21:09:47.730906 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.730870 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2zb4s" event={"ID":"070e50f3-495a-4586-b0bd-a251eb98bccc","Type":"ContainerStarted","Data":"6523e145b6e4e4b3962a17ccbae48057bfab0dd084d9e40a1c3c9ec9ea385a41"}
Apr 22 21:09:47.731940 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.731910 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5nj7r" event={"ID":"570e8677-7a14-41e1-af96-2344f7ef5d3a","Type":"ContainerStarted","Data":"18fa74f719443cb09262d56a16843b6fa6e4be899dd623c0c64f7e0e5f80531a"}
Apr 22 21:09:47.733145 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.733120 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" event={"ID":"c437a3d4-bcaa-4353-b17a-d8d4f6753b20","Type":"ContainerStarted","Data":"b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e"}
Apr 22 21:09:47.733226 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.733152 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" event={"ID":"c437a3d4-bcaa-4353-b17a-d8d4f6753b20","Type":"ContainerStarted","Data":"2be1a10414b14adcf0d8fce7c450e648127f231fe238022e540bedd23f15d7dd"}
Apr 22 21:09:47.733286 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.733275 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj"
Apr 22 21:09:47.751001 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:47.750952 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" podStartSLOduration=57.75093563 podStartE2EDuration="57.75093563s" podCreationTimestamp="2026-04-22 21:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:47.749847283 +0000 UTC m=+65.858658214" watchObservedRunningTime="2026-04-22 21:09:47.75093563 +0000 UTC m=+65.859746546"
Apr 22 21:09:48.288145 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:48.288104 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w"
Apr 22 21:09:48.291031 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:48.291003 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f4126f8f-7b88-4c50-82f3-3a91a3388519-metrics-certs\") pod \"network-metrics-daemon-rpz8w\" (UID: \"f4126f8f-7b88-4c50-82f3-3a91a3388519\") " pod="openshift-multus/network-metrics-daemon-rpz8w"
Apr 22 21:09:48.513100 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:48.513070 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ccfht\""
Apr 22 21:09:48.521009 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:48.520980 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpz8w"
Apr 22 21:09:48.648686 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:48.648655 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rpz8w"]
Apr 22 21:09:48.986671 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:48.986641 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4126f8f_7b88_4c50_82f3_3a91a3388519.slice/crio-91f5158181d81540107ab898f86d1219280da22ea300c40225981e496c048aae WatchSource:0}: Error finding container 91f5158181d81540107ab898f86d1219280da22ea300c40225981e496c048aae: Status 404 returned error can't find the container with id 91f5158181d81540107ab898f86d1219280da22ea300c40225981e496c048aae
Apr 22 21:09:49.738946 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:49.738911 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rpz8w" event={"ID":"f4126f8f-7b88-4c50-82f3-3a91a3388519","Type":"ContainerStarted","Data":"91f5158181d81540107ab898f86d1219280da22ea300c40225981e496c048aae"}
Apr 22 21:09:50.743181 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:50.743103 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5nj7r" event={"ID":"570e8677-7a14-41e1-af96-2344f7ef5d3a","Type":"ContainerStarted","Data":"cd05c3915f3169f997b07475b63c037adc63f35a8a99285628e0574ff68f5465"}
Apr 22 21:09:50.743181 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:50.743138 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5nj7r" event={"ID":"570e8677-7a14-41e1-af96-2344f7ef5d3a","Type":"ContainerStarted","Data":"d7c84d3e11a560bb76e89ab03dda0247b5ac37cfb479a1e9a0acf4d117d2a0e6"}
Apr 22 21:09:50.743680 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:50.743222 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5nj7r"
Apr 22 21:09:50.744352 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:50.744330 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2zb4s" event={"ID":"070e50f3-495a-4586-b0bd-a251eb98bccc","Type":"ContainerStarted","Data":"917276ac6c0e7ca9078b039c6c4676a901d58d464037ec3246450637f8673149"}
Apr 22 21:09:50.763001 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:50.762957 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5nj7r" podStartSLOduration=33.322739313 podStartE2EDuration="35.762944971s" podCreationTimestamp="2026-04-22 21:09:15 +0000 UTC" firstStartedPulling="2026-04-22 21:09:47.489543884 +0000 UTC m=+65.598354774" lastFinishedPulling="2026-04-22 21:09:49.929749524 +0000 UTC m=+68.038560432" observedRunningTime="2026-04-22 21:09:50.761824213 +0000 UTC m=+68.870635127" watchObservedRunningTime="2026-04-22 21:09:50.762944971 +0000 UTC m=+68.871755949"
Apr 22 21:09:50.776061 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:50.776021 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2zb4s" podStartSLOduration=33.343013165 podStartE2EDuration="35.776010295s" podCreationTimestamp="2026-04-22 21:09:15 +0000 UTC" firstStartedPulling="2026-04-22 21:09:47.503756594 +0000 UTC m=+65.612567484" lastFinishedPulling="2026-04-22 21:09:49.936753715 +0000 UTC m=+68.045564614" observedRunningTime="2026-04-22 21:09:50.775295525 +0000 UTC m=+68.884106436" watchObservedRunningTime="2026-04-22 21:09:50.776010295 +0000 UTC m=+68.884821208"
Apr 22 21:09:51.749048 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:51.749009 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rpz8w" event={"ID":"f4126f8f-7b88-4c50-82f3-3a91a3388519","Type":"ContainerStarted","Data":"aa7fe09032bf95c2af093412bb5612892bcf77750ab81f0fe412e520d79f5ba0"}
Apr 22 21:09:51.749048 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:51.749051 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rpz8w" event={"ID":"f4126f8f-7b88-4c50-82f3-3a91a3388519","Type":"ContainerStarted","Data":"961753a9758621b2b4df08ac706fec06943d4412a386ccce65914bbee92915ac"}
Apr 22 21:09:51.766804 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:51.766707 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rpz8w" podStartSLOduration=67.925184082 podStartE2EDuration="1m9.766693181s" podCreationTimestamp="2026-04-22 21:08:42 +0000 UTC" firstStartedPulling="2026-04-22 21:09:48.988588005 +0000 UTC m=+67.097398896" lastFinishedPulling="2026-04-22 21:09:50.830097089 +0000 UTC m=+68.938907995" observedRunningTime="2026-04-22 21:09:51.765453547 +0000 UTC m=+69.874264472" watchObservedRunningTime="2026-04-22 21:09:51.766693181 +0000 UTC m=+69.875504095"
Apr 22 21:09:51.881311 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:51.881277 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct"]
Apr 22 21:09:51.903014 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:51.902985 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6854cd699f-kt8sj"]
Apr 22 21:09:51.903014 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:51.903015 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct"]
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" Apr 22 21:09:51.905348 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:51.905324 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-ksszp\"" Apr 22 21:09:51.905674 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:51.905356 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 21:09:51.914711 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:51.914684 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c27159ec-9cf6-4f65-a975-c4509499046f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fsbct\" (UID: \"c27159ec-9cf6-4f65-a975-c4509499046f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" Apr 22 21:09:52.001754 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.001671 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-m4m5v"] Apr 22 21:09:52.015419 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.015390 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c27159ec-9cf6-4f65-a975-c4509499046f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fsbct\" (UID: \"c27159ec-9cf6-4f65-a975-c4509499046f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" Apr 22 21:09:52.027659 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.027636 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c27159ec-9cf6-4f65-a975-c4509499046f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fsbct\" (UID: \"c27159ec-9cf6-4f65-a975-c4509499046f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" Apr 22 21:09:52.028442 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.028395 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-kz6zz"] Apr 22 21:09:52.028616 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.028591 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.031319 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.031294 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7tkh8\"" Apr 22 21:09:52.031319 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.031305 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 21:09:52.031483 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.031370 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 21:09:52.031483 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.031458 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 21:09:52.031711 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.031688 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 21:09:52.049573 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.049552 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-55c565d499-2vp9n"] Apr 22 21:09:52.049700 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.049686 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kz6zz" Apr 22 21:09:52.051869 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.051847 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 21:09:52.051947 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.051897 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 21:09:52.051947 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.051903 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-fm7xx\"" Apr 22 21:09:52.068094 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.068074 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kz6zz"] Apr 22 21:09:52.068164 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.068097 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m4m5v"] Apr 22 21:09:52.068164 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.068106 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55c565d499-2vp9n"] Apr 22 21:09:52.068232 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.068185 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.116403 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116376 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/97d77e34-3326-4a42-96bb-659316f50103-data-volume\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.116531 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116411 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-installation-pull-secrets\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.116531 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116456 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/97d77e34-3326-4a42-96bb-659316f50103-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.116531 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116486 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/97d77e34-3326-4a42-96bb-659316f50103-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.116531 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116519 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-image-registry-private-configuration\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.116709 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116586 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-registry-tls\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.116709 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116665 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksbp7\" (UniqueName: \"kubernetes.io/projected/97d77e34-3326-4a42-96bb-659316f50103-kube-api-access-ksbp7\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.116709 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116693 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ktjh\" (UniqueName: 
\"kubernetes.io/projected/7ed62681-dcb6-451c-970e-c5b940202a6b-kube-api-access-4ktjh\") pod \"downloads-6bcc868b7-kz6zz\" (UID: \"7ed62681-dcb6-451c-970e-c5b940202a6b\") " pod="openshift-console/downloads-6bcc868b7-kz6zz" Apr 22 21:09:52.116819 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116716 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-bound-sa-token\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.116819 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116732 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-ca-trust-extracted\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.116819 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116751 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-registry-certificates\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.116819 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116771 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/97d77e34-3326-4a42-96bb-659316f50103-crio-socket\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.116819 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116796 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wpzt\" (UniqueName: \"kubernetes.io/projected/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-kube-api-access-6wpzt\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.116819 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.116816 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-trusted-ca\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.212806 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.212775 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" Apr 22 21:09:52.217606 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217588 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/97d77e34-3326-4a42-96bb-659316f50103-data-volume\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.217690 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217618 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-installation-pull-secrets\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.217690 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217647 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/97d77e34-3326-4a42-96bb-659316f50103-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.217690 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/97d77e34-3326-4a42-96bb-659316f50103-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.217832 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217701 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-image-registry-private-configuration\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.217832 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217726 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-registry-tls\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.217832 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217759 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksbp7\" (UniqueName: \"kubernetes.io/projected/97d77e34-3326-4a42-96bb-659316f50103-kube-api-access-ksbp7\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.217832 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217787 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ktjh\" (UniqueName: \"kubernetes.io/projected/7ed62681-dcb6-451c-970e-c5b940202a6b-kube-api-access-4ktjh\") pod \"downloads-6bcc868b7-kz6zz\" (UID: 
\"7ed62681-dcb6-451c-970e-c5b940202a6b\") " pod="openshift-console/downloads-6bcc868b7-kz6zz" Apr 22 21:09:52.217832 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217813 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-bound-sa-token\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.218075 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-ca-trust-extracted\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.218075 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217868 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-registry-certificates\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.218075 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/97d77e34-3326-4a42-96bb-659316f50103-crio-socket\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.218075 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217930 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6wpzt\" (UniqueName: \"kubernetes.io/projected/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-kube-api-access-6wpzt\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.218075 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217959 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-trusted-ca\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.218075 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.217992 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/97d77e34-3326-4a42-96bb-659316f50103-data-volume\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.218432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.218305 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/97d77e34-3326-4a42-96bb-659316f50103-crio-socket\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.218432 ip-10-0-130-19 
kubenswrapper[2570]: I0422 21:09:52.218330 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/97d77e34-3326-4a42-96bb-659316f50103-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.218752 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.218726 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-ca-trust-extracted\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.218857 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.218832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-registry-certificates\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.219093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.219071 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-trusted-ca\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.220504 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.220475 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-installation-pull-secrets\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.221070 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.221033 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-image-registry-private-configuration\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.221148 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.221093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/97d77e34-3326-4a42-96bb-659316f50103-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.221406 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.221390 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-registry-tls\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.226803 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.226780 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-bound-sa-token\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.227158 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.227137 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wpzt\" (UniqueName: \"kubernetes.io/projected/dd1972a0-df78-46e1-adb6-9b8e74b0f9f1-kube-api-access-6wpzt\") pod \"image-registry-55c565d499-2vp9n\" (UID: \"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1\") " pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.227218 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.227204 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ktjh\" (UniqueName: \"kubernetes.io/projected/7ed62681-dcb6-451c-970e-c5b940202a6b-kube-api-access-4ktjh\") pod \"downloads-6bcc868b7-kz6zz\" (UID: \"7ed62681-dcb6-451c-970e-c5b940202a6b\") " pod="openshift-console/downloads-6bcc868b7-kz6zz" Apr 22 21:09:52.227638 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.227617 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksbp7\" (UniqueName: \"kubernetes.io/projected/97d77e34-3326-4a42-96bb-659316f50103-kube-api-access-ksbp7\") pod \"insights-runtime-extractor-m4m5v\" (UID: \"97d77e34-3326-4a42-96bb-659316f50103\") " pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.328035 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.327969 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct"] Apr 22 21:09:52.334310 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:52.333359 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc27159ec_9cf6_4f65_a975_c4509499046f.slice/crio-2fffbd454347426569480408143adab8ede540a98aea4dbb685fabb50f946e78 WatchSource:0}: Error finding container 2fffbd454347426569480408143adab8ede540a98aea4dbb685fabb50f946e78: Status 404 returned error can't find the container with id 2fffbd454347426569480408143adab8ede540a98aea4dbb685fabb50f946e78 Apr 22 21:09:52.338692 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.338670 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m4m5v" Apr 22 21:09:52.358618 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.358595 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kz6zz" Apr 22 21:09:52.376425 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.376400 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.494573 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.494447 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m4m5v"] Apr 22 21:09:52.497307 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:52.497277 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d77e34_3326_4a42_96bb_659316f50103.slice/crio-3bcc64a29f9343584e2216e6596c3775946df5b4eeaa9af44891b7570f78629d WatchSource:0}: Error finding container 3bcc64a29f9343584e2216e6596c3775946df5b4eeaa9af44891b7570f78629d: Status 404 returned error can't find the container with id 3bcc64a29f9343584e2216e6596c3775946df5b4eeaa9af44891b7570f78629d Apr 22 21:09:52.512186 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.512117 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kz6zz"] Apr 22 21:09:52.515092 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:52.515065 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ed62681_dcb6_451c_970e_c5b940202a6b.slice/crio-eb0f797f071211b1d6b127819f7a20515f0b2ed846964d1d64a83208262db237 WatchSource:0}: Error finding container eb0f797f071211b1d6b127819f7a20515f0b2ed846964d1d64a83208262db237: Status 404 returned error can't find the container with id eb0f797f071211b1d6b127819f7a20515f0b2ed846964d1d64a83208262db237 Apr 22 21:09:52.551951 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.551857 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-55c565d499-2vp9n"] Apr 22 21:09:52.553784 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:52.553758 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd1972a0_df78_46e1_adb6_9b8e74b0f9f1.slice/crio-f22f82b42f70e1485da37a8e67658cf58fa3c7e8c47c4374e3cbd94be0b47270 WatchSource:0}: Error finding container f22f82b42f70e1485da37a8e67658cf58fa3c7e8c47c4374e3cbd94be0b47270: Status 404 returned error can't find the container with id f22f82b42f70e1485da37a8e67658cf58fa3c7e8c47c4374e3cbd94be0b47270 Apr 22 21:09:52.754028 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.753991 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55c565d499-2vp9n" event={"ID":"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1","Type":"ContainerStarted","Data":"c438f2435072685c8a62c816146a6dab00b64f3ce1b70f3fbc039c7107ecb605"} Apr 22 21:09:52.754462 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.754037 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-55c565d499-2vp9n" event={"ID":"dd1972a0-df78-46e1-adb6-9b8e74b0f9f1","Type":"ContainerStarted","Data":"f22f82b42f70e1485da37a8e67658cf58fa3c7e8c47c4374e3cbd94be0b47270"} Apr 22 21:09:52.754462 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.754084 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:09:52.755139 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.755115 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kz6zz" 
event={"ID":"7ed62681-dcb6-451c-970e-c5b940202a6b","Type":"ContainerStarted","Data":"eb0f797f071211b1d6b127819f7a20515f0b2ed846964d1d64a83208262db237"} Apr 22 21:09:52.756415 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.756384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4m5v" event={"ID":"97d77e34-3326-4a42-96bb-659316f50103","Type":"ContainerStarted","Data":"73912d89cd7ce2e215176f2cbb8c5bcb9509cddcd01ba89918f6128cc6bdc7e8"} Apr 22 21:09:52.756501 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.756415 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4m5v" event={"ID":"97d77e34-3326-4a42-96bb-659316f50103","Type":"ContainerStarted","Data":"3bcc64a29f9343584e2216e6596c3775946df5b4eeaa9af44891b7570f78629d"} Apr 22 21:09:52.757571 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.757551 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" event={"ID":"c27159ec-9cf6-4f65-a975-c4509499046f","Type":"ContainerStarted","Data":"2fffbd454347426569480408143adab8ede540a98aea4dbb685fabb50f946e78"} Apr 22 21:09:52.772858 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:52.772781 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-55c565d499-2vp9n" podStartSLOduration=1.772765621 podStartE2EDuration="1.772765621s" podCreationTimestamp="2026-04-22 21:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:52.771319108 +0000 UTC m=+70.880130022" watchObservedRunningTime="2026-04-22 21:09:52.772765621 +0000 UTC m=+70.881576535" Apr 22 21:09:54.767565 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:54.767468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4m5v" event={"ID":"97d77e34-3326-4a42-96bb-659316f50103","Type":"ContainerStarted","Data":"0f647e9cc774706dc02b77f2f7ca8d6efc0ec9836dd14fcabf8758d85b8d2fe7"} Apr 22 21:09:54.769567 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:54.769526 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" event={"ID":"c27159ec-9cf6-4f65-a975-c4509499046f","Type":"ContainerStarted","Data":"623bbeed91db7ac8c6b298cc3765c4ad09ef371ac39e12801e871a28f56d3b60"} Apr 22 21:09:54.769898 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:54.769871 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" Apr 22 21:09:54.776300 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:54.776276 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" Apr 22 21:09:54.784555 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:54.784502 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fsbct" podStartSLOduration=1.7063920110000002 podStartE2EDuration="3.784486248s" podCreationTimestamp="2026-04-22 21:09:51 +0000 UTC" firstStartedPulling="2026-04-22 21:09:52.336135694 +0000 UTC m=+70.444946585" lastFinishedPulling="2026-04-22 21:09:54.414229929 +0000 UTC m=+72.523040822" 
observedRunningTime="2026-04-22 21:09:54.782752303 +0000 UTC m=+72.891563217" watchObservedRunningTime="2026-04-22 21:09:54.784486248 +0000 UTC m=+72.893297164" Apr 22 21:09:56.667659 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:56.667628 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7mmfr" Apr 22 21:09:56.776778 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:56.776743 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m4m5v" event={"ID":"97d77e34-3326-4a42-96bb-659316f50103","Type":"ContainerStarted","Data":"17c97de0e640da96a1043de2a1714d67ee7a71da070203a05d56468f93f6da54"} Apr 22 21:09:56.792115 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:56.791676 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-m4m5v" podStartSLOduration=2.439366449 podStartE2EDuration="5.791658064s" podCreationTimestamp="2026-04-22 21:09:51 +0000 UTC" firstStartedPulling="2026-04-22 21:09:52.6192134 +0000 UTC m=+70.728024291" lastFinishedPulling="2026-04-22 21:09:55.971505001 +0000 UTC m=+74.080315906" observedRunningTime="2026-04-22 21:09:56.7914148 +0000 UTC m=+74.900225712" watchObservedRunningTime="2026-04-22 21:09:56.791658064 +0000 UTC m=+74.900468968" Apr 22 21:09:57.311953 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:57.311906 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" podUID="70973331-d0fd-42a7-81d2-3009076d2a1f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 21:09:57.312136 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:57.311989 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" Apr 22 21:09:57.312712 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:57.312673 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"c22c1865ee37c3242e39045ab9a847be308c4eb5f1e2e0bdc926187375ca1dbe"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 21:09:57.312791 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:57.312738 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" podUID="70973331-d0fd-42a7-81d2-3009076d2a1f" containerName="service-proxy" containerID="cri-o://c22c1865ee37c3242e39045ab9a847be308c4eb5f1e2e0bdc926187375ca1dbe" gracePeriod=30 Apr 22 21:09:57.782841 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:57.782800 2570 generic.go:358] "Generic (PLEG): container finished" podID="70973331-d0fd-42a7-81d2-3009076d2a1f" containerID="c22c1865ee37c3242e39045ab9a847be308c4eb5f1e2e0bdc926187375ca1dbe" exitCode=2 Apr 22 21:09:57.783304 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:57.782894 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5f54c74c8c-vmvsk" event={"ID":"70973331-d0fd-42a7-81d2-3009076d2a1f","Type":"ContainerDied","Data":"c22c1865ee37c3242e39045ab9a847be308c4eb5f1e2e0bdc926187375ca1dbe"} Apr 22 21:09:57.783304 ip-10-0-130-19 kubenswrapper[2570]: 
Apr 22 21:09:59.651663 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.651634 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-46ljz"]
Apr 22 21:09:59.656804 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.656781 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.659148 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.659086 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 21:09:59.659148 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.659096 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 21:09:59.659148 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.659111 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 21:09:59.659380 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.659114 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 21:09:59.660141 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.660118 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7bz2b\""
Apr 22 21:09:59.660232 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.660208 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 21:09:59.660321 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.660304 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 21:09:59.776762 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.776731 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zzls\" (UniqueName: \"kubernetes.io/projected/d197d611-b70d-4786-b1c4-59fd6632ebb9-kube-api-access-4zzls\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.776919 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.776783 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d197d611-b70d-4786-b1c4-59fd6632ebb9-metrics-client-ca\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.776919 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.776815 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-tls\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.776919 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.776864 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-accelerators-collector-config\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.776919 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.776886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d197d611-b70d-4786-b1c4-59fd6632ebb9-sys\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.776919 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.776903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.777166 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.776932 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-textfile\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.777166 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.777023 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-wtmp\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.777166 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.777078 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d197d611-b70d-4786-b1c4-59fd6632ebb9-root\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878344 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878313 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-textfile\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878526 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878363 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-wtmp\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878526 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878403 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d197d611-b70d-4786-b1c4-59fd6632ebb9-root\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878526 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878437 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zzls\" (UniqueName: \"kubernetes.io/projected/d197d611-b70d-4786-b1c4-59fd6632ebb9-kube-api-access-4zzls\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878526 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878475 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d197d611-b70d-4786-b1c4-59fd6632ebb9-metrics-client-ca\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878526 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878504 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-tls\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878780 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-accelerators-collector-config\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878780 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878563 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d197d611-b70d-4786-b1c4-59fd6632ebb9-sys\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878780 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.878921 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878798 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-textfile\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.879023 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.878982 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-wtmp\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz"
Apr 22 21:09:59.879158 ip-10-0-130-19
kubenswrapper[2570]: I0422 21:09:59.879072 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d197d611-b70d-4786-b1c4-59fd6632ebb9-sys\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz" Apr 22 21:09:59.879158 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.879120 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d197d611-b70d-4786-b1c4-59fd6632ebb9-root\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz" Apr 22 21:09:59.879437 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.879389 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-accelerators-collector-config\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz" Apr 22 21:09:59.879535 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.879506 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d197d611-b70d-4786-b1c4-59fd6632ebb9-metrics-client-ca\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz" Apr 22 21:09:59.881552 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.881524 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz" Apr 22 21:09:59.881836 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.881814 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d197d611-b70d-4786-b1c4-59fd6632ebb9-node-exporter-tls\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz" Apr 22 21:09:59.890855 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.890833 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zzls\" (UniqueName: \"kubernetes.io/projected/d197d611-b70d-4786-b1c4-59fd6632ebb9-kube-api-access-4zzls\") pod \"node-exporter-46ljz\" (UID: \"d197d611-b70d-4786-b1c4-59fd6632ebb9\") " pod="openshift-monitoring/node-exporter-46ljz" Apr 22 21:09:59.967918 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:09:59.967877 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-46ljz" Apr 22 21:09:59.977224 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:09:59.977184 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd197d611_b70d_4786_b1c4_59fd6632ebb9.slice/crio-79616e5ae7393a6d17fa933b7a158ecb092ff0483e01178f41afa03e8b218107 WatchSource:0}: Error finding container 79616e5ae7393a6d17fa933b7a158ecb092ff0483e01178f41afa03e8b218107: Status 404 returned error can't find the container with id 79616e5ae7393a6d17fa933b7a158ecb092ff0483e01178f41afa03e8b218107 Apr 22 21:10:00.752224 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:00.751490 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5nj7r" Apr 22 21:10:00.795665 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:00.795630 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-46ljz" event={"ID":"d197d611-b70d-4786-b1c4-59fd6632ebb9","Type":"ContainerStarted","Data":"79616e5ae7393a6d17fa933b7a158ecb092ff0483e01178f41afa03e8b218107"} Apr 22 21:10:01.800237 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:01.800120 2570 generic.go:358] "Generic (PLEG): container finished" podID="d197d611-b70d-4786-b1c4-59fd6632ebb9" containerID="c8860fcb65dbcda32c0e98ebe67f01d246be95b86c864f5c36636dfdaa3e631b" exitCode=0 Apr 22 21:10:01.800237 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:01.800220 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-46ljz" event={"ID":"d197d611-b70d-4786-b1c4-59fd6632ebb9","Type":"ContainerDied","Data":"c8860fcb65dbcda32c0e98ebe67f01d246be95b86c864f5c36636dfdaa3e631b"} Apr 22 21:10:02.805952 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:02.805912 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-46ljz" event={"ID":"d197d611-b70d-4786-b1c4-59fd6632ebb9","Type":"ContainerStarted","Data":"33016ce6e27aa6e3824904feb6cf7131fc4ecd2a38a69de929295a3b1af07298"} Apr 22 21:10:02.805952 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:02.805957 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-46ljz" event={"ID":"d197d611-b70d-4786-b1c4-59fd6632ebb9","Type":"ContainerStarted","Data":"ba8b92f184b7212a7f22222d2bbb7d9e0d7391f8be53aec2c97dd638ae64dddb"} Apr 22 21:10:02.822723 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:02.822677 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-46ljz" podStartSLOduration=2.9512447699999997 podStartE2EDuration="3.822662203s" podCreationTimestamp="2026-04-22 21:09:59 +0000 UTC" firstStartedPulling="2026-04-22 21:09:59.979150923 +0000 UTC m=+78.087961814" lastFinishedPulling="2026-04-22 21:10:00.850568341 +0000 UTC m=+78.959379247" observedRunningTime="2026-04-22 21:10:02.821651517 +0000 UTC m=+80.930462428" watchObservedRunningTime="2026-04-22 21:10:02.822662203 +0000 UTC m=+80.931473116" Apr 22 21:10:11.909556 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:11.909522 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:10:12.838689 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:12.838645 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kz6zz" 
event={"ID":"7ed62681-dcb6-451c-970e-c5b940202a6b","Type":"ContainerStarted","Data":"9bf5df8af66aa40aab39056fb87300409062c01873efeb53707f6cd63ad028cb"} Apr 22 21:10:12.838916 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:12.838893 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-kz6zz" Apr 22 21:10:12.849603 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:12.849572 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-kz6zz" Apr 22 21:10:12.855821 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:12.855772 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-kz6zz" podStartSLOduration=1.947204591 podStartE2EDuration="21.855760226s" podCreationTimestamp="2026-04-22 21:09:51 +0000 UTC" firstStartedPulling="2026-04-22 21:09:52.516976678 +0000 UTC m=+70.625787575" lastFinishedPulling="2026-04-22 21:10:12.425532319 +0000 UTC m=+90.534343210" observedRunningTime="2026-04-22 21:10:12.854193836 +0000 UTC m=+90.963004745" watchObservedRunningTime="2026-04-22 21:10:12.855760226 +0000 UTC m=+90.964571138" Apr 22 21:10:13.765413 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:13.765382 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-55c565d499-2vp9n" Apr 22 21:10:16.922991 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:16.922951 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" podUID="c437a3d4-bcaa-4353-b17a-d8d4f6753b20" containerName="registry" containerID="cri-o://b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e" gracePeriod=30 Apr 22 21:10:17.193991 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.193968 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:10:17.328506 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.328472 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") pod \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " Apr 22 21:10:17.328506 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.328511 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4vm2\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-kube-api-access-g4vm2\") pod \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " Apr 22 21:10:17.328727 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.328544 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-installation-pull-secrets\") pod \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " Apr 22 21:10:17.328727 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.328577 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-certificates\") pod \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " Apr 22 21:10:17.328727 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.328618 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-trusted-ca\") pod \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " Apr 22 21:10:17.328727 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.328673 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-bound-sa-token\") pod \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " Apr 22 21:10:17.328727 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.328720 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-ca-trust-extracted\") pod \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " Apr 22 21:10:17.328983 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.328750 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-image-registry-private-configuration\") pod \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\" (UID: \"c437a3d4-bcaa-4353-b17a-d8d4f6753b20\") " Apr 22 21:10:17.329073 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.329038 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c437a3d4-bcaa-4353-b17a-d8d4f6753b20" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:17.329415 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.329358 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c437a3d4-bcaa-4353-b17a-d8d4f6753b20" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:17.331516 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.331470 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c437a3d4-bcaa-4353-b17a-d8d4f6753b20" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:17.331516 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.331497 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c437a3d4-bcaa-4353-b17a-d8d4f6753b20" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:17.331516 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.331508 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c437a3d4-bcaa-4353-b17a-d8d4f6753b20" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:17.331755 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.331636 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c437a3d4-bcaa-4353-b17a-d8d4f6753b20" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:17.331755 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.331707 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-kube-api-access-g4vm2" (OuterVolumeSpecName: "kube-api-access-g4vm2") pod "c437a3d4-bcaa-4353-b17a-d8d4f6753b20" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20"). InnerVolumeSpecName "kube-api-access-g4vm2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:17.340510 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.340478 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c437a3d4-bcaa-4353-b17a-d8d4f6753b20" (UID: "c437a3d4-bcaa-4353-b17a-d8d4f6753b20"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:10:17.430398 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.430324 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-ca-trust-extracted\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:10:17.430398 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.430361 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-image-registry-private-configuration\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:10:17.430398 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.430377 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:10:17.430398 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.430387 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g4vm2\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-kube-api-access-g4vm2\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:10:17.430398 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.430396 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-installation-pull-secrets\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:10:17.430748 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.430405 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-registry-certificates\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:10:17.430748 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.430419 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-trusted-ca\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:10:17.430748 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.430430 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c437a3d4-bcaa-4353-b17a-d8d4f6753b20-bound-sa-token\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:10:17.856717 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.856636 2570 generic.go:358] "Generic (PLEG): container finished" podID="c437a3d4-bcaa-4353-b17a-d8d4f6753b20" containerID="b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e" exitCode=0 Apr 22 21:10:17.856717 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.856687 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" event={"ID":"c437a3d4-bcaa-4353-b17a-d8d4f6753b20","Type":"ContainerDied","Data":"b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e"} Apr 22 21:10:17.857012 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.856721 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" Apr 22 21:10:17.857012 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.856753 2570 scope.go:117] "RemoveContainer" containerID="b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e" Apr 22 21:10:17.857012 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.856724 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6854cd699f-kt8sj" event={"ID":"c437a3d4-bcaa-4353-b17a-d8d4f6753b20","Type":"ContainerDied","Data":"2be1a10414b14adcf0d8fce7c450e648127f231fe238022e540bedd23f15d7dd"} Apr 22 21:10:17.866446 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.866423 2570 scope.go:117] "RemoveContainer" containerID="b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e" Apr 22 21:10:17.866803 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:10:17.866765 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e\": container with ID starting with b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e not found: ID does not exist" containerID="b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e" Apr 22 21:10:17.866906 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.866813 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e"} err="failed to get container status \"b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e\": rpc error: code = NotFound desc = could not find container \"b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e\": container with ID starting with b2728652b1aa5e3a2edbc34600f2b3af949a9a6e9e2ea656acb2ee1ed9f61a4e not found: ID does not exist" Apr 22 21:10:17.879265 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.879227 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6854cd699f-kt8sj"] Apr 22 21:10:17.882412 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:17.882388 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6854cd699f-kt8sj"] Apr 22 21:10:18.489441 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:10:18.489408 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c437a3d4-bcaa-4353-b17a-d8d4f6753b20" path="/var/lib/kubelet/pods/c437a3d4-bcaa-4353-b17a-d8d4f6753b20/volumes" Apr 22 21:12:31.928503 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.928467 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t"] Apr 22 21:12:31.928899 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.928811 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c437a3d4-bcaa-4353-b17a-d8d4f6753b20" containerName="registry" Apr 22 21:12:31.928899 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.928827 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c437a3d4-bcaa-4353-b17a-d8d4f6753b20" containerName="registry" Apr 22 21:12:31.928899 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.928888 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c437a3d4-bcaa-4353-b17a-d8d4f6753b20" containerName="registry" Apr 22 21:12:31.931750 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.931730 2570 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:31.934646 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.934625 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jq55h\"" Apr 22 21:12:31.934912 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.934880 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 21:12:31.934912 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.934894 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 21:12:31.935064 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.934913 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 21:12:31.935064 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.934899 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 21:12:31.948669 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:31.948648 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t"] Apr 22 21:12:32.070841 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.070805 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82c088f5-8d4c-4aed-9424-eebf30592f8f-webhook-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nnr4t\" (UID: \"82c088f5-8d4c-4aed-9424-eebf30592f8f\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.071015 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.070933 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6hdt\" (UniqueName: \"kubernetes.io/projected/82c088f5-8d4c-4aed-9424-eebf30592f8f-kube-api-access-k6hdt\") pod \"opendatahub-operator-controller-manager-754bfc4657-nnr4t\" (UID: \"82c088f5-8d4c-4aed-9424-eebf30592f8f\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.071015 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.070970 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82c088f5-8d4c-4aed-9424-eebf30592f8f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nnr4t\" (UID: \"82c088f5-8d4c-4aed-9424-eebf30592f8f\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.171586 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.171554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82c088f5-8d4c-4aed-9424-eebf30592f8f-webhook-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nnr4t\" (UID: \"82c088f5-8d4c-4aed-9424-eebf30592f8f\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.171757 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.171610 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6hdt\" 
(UniqueName: \"kubernetes.io/projected/82c088f5-8d4c-4aed-9424-eebf30592f8f-kube-api-access-k6hdt\") pod \"opendatahub-operator-controller-manager-754bfc4657-nnr4t\" (UID: \"82c088f5-8d4c-4aed-9424-eebf30592f8f\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.171757 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.171632 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82c088f5-8d4c-4aed-9424-eebf30592f8f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nnr4t\" (UID: \"82c088f5-8d4c-4aed-9424-eebf30592f8f\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.174052 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.174032 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/82c088f5-8d4c-4aed-9424-eebf30592f8f-webhook-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nnr4t\" (UID: \"82c088f5-8d4c-4aed-9424-eebf30592f8f\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.174126 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.174086 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/82c088f5-8d4c-4aed-9424-eebf30592f8f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-nnr4t\" (UID: \"82c088f5-8d4c-4aed-9424-eebf30592f8f\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.180801 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.180748 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6hdt\" (UniqueName: \"kubernetes.io/projected/82c088f5-8d4c-4aed-9424-eebf30592f8f-kube-api-access-k6hdt\") pod \"opendatahub-operator-controller-manager-754bfc4657-nnr4t\" (UID: \"82c088f5-8d4c-4aed-9424-eebf30592f8f\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.242084 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.242058 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:32.357341 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:32.357305 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t"] Apr 22 21:12:32.361720 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:12:32.361690 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82c088f5_8d4c_4aed_9424_eebf30592f8f.slice/crio-0ebe851d7deee70d01affa30afced28950cf5636e95a0329e7282b0486ca9686 WatchSource:0}: Error finding container 0ebe851d7deee70d01affa30afced28950cf5636e95a0329e7282b0486ca9686: Status 404 returned error can't find the container with id 0ebe851d7deee70d01affa30afced28950cf5636e95a0329e7282b0486ca9686 Apr 22 21:12:33.210328 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:33.210288 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" event={"ID":"82c088f5-8d4c-4aed-9424-eebf30592f8f","Type":"ContainerStarted","Data":"0ebe851d7deee70d01affa30afced28950cf5636e95a0329e7282b0486ca9686"} Apr 22 21:12:35.216958 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:35.216925 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" event={"ID":"82c088f5-8d4c-4aed-9424-eebf30592f8f","Type":"ContainerStarted","Data":"de06478e9826f519e6bcc790c6d3ec770a10863f5f6f5e88fb81dfa3dd465b8e"} Apr 22 21:12:35.217338 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:35.217060 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:35.252267 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:35.252189 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" podStartSLOduration=1.54708398 podStartE2EDuration="4.252173506s" podCreationTimestamp="2026-04-22 21:12:31 +0000 UTC" firstStartedPulling="2026-04-22 21:12:32.36370295 +0000 UTC m=+230.472513841" lastFinishedPulling="2026-04-22 21:12:35.068792465 +0000 UTC m=+233.177603367" observedRunningTime="2026-04-22 21:12:35.250592676 +0000 UTC m=+233.359403590" watchObservedRunningTime="2026-04-22 21:12:35.252173506 +0000 UTC m=+233.360984431" Apr 22 21:12:46.221695 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:46.221667 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-nnr4t" Apr 22 21:12:48.963203 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.963169 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk"] Apr 22 21:12:48.966562 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.966543 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:48.969393 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.969368 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 21:12:48.970120 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.970101 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-ktfdx\"" Apr 22 21:12:48.970235 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.970117 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:12:48.970235 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.970107 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 21:12:48.970235 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.970152 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 21:12:48.970235 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.970207 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 21:12:48.982720 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.982697 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk"] Apr 22 21:12:48.992279 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.992243 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-manager-config\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:48.992380 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.992290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-metrics-cert\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:48.992451 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.992384 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-cert\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:48.992451 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:48.992420 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv492\" (UniqueName: \"kubernetes.io/projected/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-kube-api-access-vv492\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.093431 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.093390 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-cert\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.093431 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.093433 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vv492\" (UniqueName: \"kubernetes.io/projected/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-kube-api-access-vv492\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.093682 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.093468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-manager-config\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.093682 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.093494 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-metrics-cert\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.094456 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.094433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-manager-config\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.095914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.095882 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-cert\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.096076 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.096056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-metrics-cert\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.107779 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.107757 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv492\" (UniqueName: \"kubernetes.io/projected/2a247d13-757d-4f7b-9e3e-d9ba0fec288c-kube-api-access-vv492\") pod \"lws-controller-manager-7979f84667-j9fqk\" (UID: \"2a247d13-757d-4f7b-9e3e-d9ba0fec288c\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.276062 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.275982 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:49.391942 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:49.391798 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk"] Apr 22 21:12:49.394489 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:12:49.394456 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a247d13_757d_4f7b_9e3e_d9ba0fec288c.slice/crio-79877edca9c4459b83ab566e92aa41d315449f08120ba462acebb3eb179d6352 WatchSource:0}: Error finding container 79877edca9c4459b83ab566e92aa41d315449f08120ba462acebb3eb179d6352: Status 404 returned error can't find the container with id 79877edca9c4459b83ab566e92aa41d315449f08120ba462acebb3eb179d6352 Apr 22 21:12:50.262219 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:50.262179 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" event={"ID":"2a247d13-757d-4f7b-9e3e-d9ba0fec288c","Type":"ContainerStarted","Data":"79877edca9c4459b83ab566e92aa41d315449f08120ba462acebb3eb179d6352"} Apr 22 21:12:52.268215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:52.268177 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" event={"ID":"2a247d13-757d-4f7b-9e3e-d9ba0fec288c","Type":"ContainerStarted","Data":"12c1c85ceb4005a18bcee7dd4b307a238ce11bbd9de65adae7ecf16631ddfb3d"} Apr 22 21:12:52.268667 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:52.268317 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:12:52.282138 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:12:52.282085 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" podStartSLOduration=1.8049219239999998 podStartE2EDuration="4.282070715s" podCreationTimestamp="2026-04-22 21:12:48 +0000 UTC" firstStartedPulling="2026-04-22 21:12:49.396290832 +0000 UTC m=+247.505101723" lastFinishedPulling="2026-04-22 21:12:51.873439611 +0000 UTC m=+249.982250514" observedRunningTime="2026-04-22 21:12:52.281696046 +0000 UTC m=+250.390506959" watchObservedRunningTime="2026-04-22 21:12:52.282070715 +0000 UTC m=+250.390881628" Apr 22 21:13:03.273349 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:03.273239 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7979f84667-j9fqk" Apr 22 21:13:36.381763 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.381732 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"] Apr 22 21:13:36.392629 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.392602 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.395242 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.395214 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 21:13:36.395663 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.395635 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 21:13:36.395855 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.395829 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-rk5fj\"" Apr 22 21:13:36.396003 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.395980 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 21:13:36.396721 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.396697 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"] Apr 22 21:13:36.533657 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.533618 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.533657 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.533668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.533878 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.533746 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.533878 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.533825 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.533988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.533885 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-token\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.533988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.533939 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.533988 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.533976 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.534134 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.534030 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxsrx\" (UniqueName: \"kubernetes.io/projected/04aebafb-0223-4fea-b000-baf860d9b3b7-kube-api-access-mxsrx\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.534134 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.534073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/04aebafb-0223-4fea-b000-baf860d9b3b7-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.634837 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.634744 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.634837 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.634784 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.634837 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.634810 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: 
\"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.634837 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.634833 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.635164 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.634879 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxsrx\" (UniqueName: \"kubernetes.io/projected/04aebafb-0223-4fea-b000-baf860d9b3b7-kube-api-access-mxsrx\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.635164 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.634906 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/04aebafb-0223-4fea-b000-baf860d9b3b7-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.635164 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.634952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.635164 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.635075 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.635164 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.635110 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.635464 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.635296 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" Apr 22 21:13:36.635464 
Apr 22 21:13:36.635464 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.635377 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:36.635464 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.635432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:36.635567 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.635483 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:36.635654 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.635635 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/04aebafb-0223-4fea-b000-baf860d9b3b7-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:36.637466 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.637449 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:36.637615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.637599 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:36.644596 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.644551 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/04aebafb-0223-4fea-b000-baf860d9b3b7-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:36.644732 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.644594 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxsrx\" (UniqueName: \"kubernetes.io/projected/04aebafb-0223-4fea-b000-baf860d9b3b7-kube-api-access-mxsrx\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fkv5fv\" (UID: \"04aebafb-0223-4fea-b000-baf860d9b3b7\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:36.706009 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.705973 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:36.830338 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:36.830304 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"]
Apr 22 21:13:36.834161 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:13:36.834133 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04aebafb_0223_4fea_b000_baf860d9b3b7.slice/crio-7500b64092017d699866f85caec8843b826f1a31dc7fbff7c84ba92e1e2c2d33 WatchSource:0}: Error finding container 7500b64092017d699866f85caec8843b826f1a31dc7fbff7c84ba92e1e2c2d33: Status 404 returned error can't find the container with id 7500b64092017d699866f85caec8843b826f1a31dc7fbff7c84ba92e1e2c2d33
Apr 22 21:13:37.382811 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:37.382775 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" event={"ID":"04aebafb-0223-4fea-b000-baf860d9b3b7","Type":"ContainerStarted","Data":"7500b64092017d699866f85caec8843b826f1a31dc7fbff7c84ba92e1e2c2d33"}
Apr 22 21:13:41.755950 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:41.755915 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 22 21:13:41.756191 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:41.755993 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 22 21:13:41.756191 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:41.756024 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 22 21:13:42.400581 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:42.400494 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" event={"ID":"04aebafb-0223-4fea-b000-baf860d9b3b7","Type":"ContainerStarted","Data":"fbee1df50345252f5c4da860d6c0340f626a993f2d5d0c2a6a5a4131a99384d3"}
Apr 22 21:13:42.412428 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:42.412399 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log"
Apr 22 21:13:42.412954 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:42.412934 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log"
Apr 22 21:13:42.416167 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:42.416145 2570 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 21:13:42.421570 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:42.421528 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv" podStartSLOduration=1.5018987419999998 podStartE2EDuration="6.42151701s" podCreationTimestamp="2026-04-22 21:13:36 +0000 UTC" firstStartedPulling="2026-04-22 21:13:36.836028837 +0000 UTC m=+294.944839731" lastFinishedPulling="2026-04-22 21:13:41.755647108 +0000 UTC m=+299.864457999" observedRunningTime="2026-04-22 21:13:42.420126157 +0000 UTC m=+300.528937070" watchObservedRunningTime="2026-04-22 21:13:42.42151701 +0000 UTC m=+300.530327924"
Apr 22 21:13:42.706735 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:42.706703 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:42.711210 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:42.711186 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:43.404023 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:43.403992 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:43.405168 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:43.405150 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fkv5fv"
Apr 22 21:13:50.096621 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.096589 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zvsfv"]
Apr 22 21:13:50.099492 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.099474 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv"
Apr 22 21:13:50.101841 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.101820 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 21:13:50.102768 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.102742 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-txhh6\""
Apr 22 21:13:50.102768 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.102742 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 21:13:50.110234 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.110210 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zvsfv"]
Apr 22 21:13:50.241196 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.241157 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzd6g\" (UniqueName: \"kubernetes.io/projected/a1b2e0d4-24a7-4774-9449-0f3e5ef6428b-kube-api-access-qzd6g\") pod \"kuadrant-operator-catalog-zvsfv\" (UID: \"a1b2e0d4-24a7-4774-9449-0f3e5ef6428b\") " pod="kuadrant-system/kuadrant-operator-catalog-zvsfv"
Apr 22 21:13:50.342617 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.342575 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzd6g\" (UniqueName: \"kubernetes.io/projected/a1b2e0d4-24a7-4774-9449-0f3e5ef6428b-kube-api-access-qzd6g\") pod \"kuadrant-operator-catalog-zvsfv\" (UID: \"a1b2e0d4-24a7-4774-9449-0f3e5ef6428b\") " pod="kuadrant-system/kuadrant-operator-catalog-zvsfv"
Apr 22 21:13:50.350601 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.350530 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzd6g\" (UniqueName: \"kubernetes.io/projected/a1b2e0d4-24a7-4774-9449-0f3e5ef6428b-kube-api-access-qzd6g\") pod \"kuadrant-operator-catalog-zvsfv\" (UID: \"a1b2e0d4-24a7-4774-9449-0f3e5ef6428b\") " pod="kuadrant-system/kuadrant-operator-catalog-zvsfv"
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv" Apr 22 21:13:50.475778 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.475743 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zvsfv"] Apr 22 21:13:50.544149 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.544126 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zvsfv"] Apr 22 21:13:50.546446 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:13:50.546419 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b2e0d4_24a7_4774_9449_0f3e5ef6428b.slice/crio-b02cc50aef24a7ed638eaa289e1ba1739177bd599842531deeae19b7202175ae WatchSource:0}: Error finding container b02cc50aef24a7ed638eaa289e1ba1739177bd599842531deeae19b7202175ae: Status 404 returned error can't find the container with id b02cc50aef24a7ed638eaa289e1ba1739177bd599842531deeae19b7202175ae Apr 22 21:13:50.547568 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.547551 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:13:50.670777 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.670701 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ngbgq"] Apr 22 21:13:50.674801 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.674783 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" Apr 22 21:13:50.680070 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.680045 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ngbgq"] Apr 22 21:13:50.745413 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.745375 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxtwb\" (UniqueName: \"kubernetes.io/projected/1eb452c2-c921-4091-ab91-de530abb6130-kube-api-access-vxtwb\") pod \"kuadrant-operator-catalog-ngbgq\" (UID: \"1eb452c2-c921-4091-ab91-de530abb6130\") " pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" Apr 22 21:13:50.846081 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.846044 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxtwb\" (UniqueName: \"kubernetes.io/projected/1eb452c2-c921-4091-ab91-de530abb6130-kube-api-access-vxtwb\") pod \"kuadrant-operator-catalog-ngbgq\" (UID: \"1eb452c2-c921-4091-ab91-de530abb6130\") " pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" Apr 22 21:13:50.853544 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.853514 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxtwb\" (UniqueName: \"kubernetes.io/projected/1eb452c2-c921-4091-ab91-de530abb6130-kube-api-access-vxtwb\") pod \"kuadrant-operator-catalog-ngbgq\" (UID: \"1eb452c2-c921-4091-ab91-de530abb6130\") " pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" Apr 22 21:13:50.984903 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:50.984862 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" Apr 22 21:13:51.104467 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:51.104366 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ngbgq"] Apr 22 21:13:51.106886 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:13:51.106857 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eb452c2_c921_4091_ab91_de530abb6130.slice/crio-4fd474b90f4c88c722df200e638077ace8d91887f8455a81505f9b848b3e748a WatchSource:0}: Error finding container 4fd474b90f4c88c722df200e638077ace8d91887f8455a81505f9b848b3e748a: Status 404 returned error can't find the container with id 4fd474b90f4c88c722df200e638077ace8d91887f8455a81505f9b848b3e748a Apr 22 21:13:51.425761 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:51.425680 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv" event={"ID":"a1b2e0d4-24a7-4774-9449-0f3e5ef6428b","Type":"ContainerStarted","Data":"b02cc50aef24a7ed638eaa289e1ba1739177bd599842531deeae19b7202175ae"} Apr 22 21:13:51.426679 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:51.426650 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" event={"ID":"1eb452c2-c921-4091-ab91-de530abb6130","Type":"ContainerStarted","Data":"4fd474b90f4c88c722df200e638077ace8d91887f8455a81505f9b848b3e748a"} Apr 22 21:13:53.436691 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:53.436650 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" event={"ID":"1eb452c2-c921-4091-ab91-de530abb6130","Type":"ContainerStarted","Data":"041e166f800a1c1b0b98edb74fd12061509c57e86f2d4bdb8055ddf3b0608817"} Apr 22 21:13:53.437914 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:53.437893 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv" event={"ID":"a1b2e0d4-24a7-4774-9449-0f3e5ef6428b","Type":"ContainerStarted","Data":"be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb"} Apr 22 21:13:53.438049 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:53.438007 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv" podUID="a1b2e0d4-24a7-4774-9449-0f3e5ef6428b" containerName="registry-server" containerID="cri-o://be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb" gracePeriod=2 Apr 22 21:13:53.452334 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:53.452238 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" podStartSLOduration=1.812896532 podStartE2EDuration="3.452223565s" podCreationTimestamp="2026-04-22 21:13:50 +0000 UTC" firstStartedPulling="2026-04-22 21:13:51.108330126 +0000 UTC m=+309.217141030" lastFinishedPulling="2026-04-22 21:13:52.747657172 +0000 UTC m=+310.856468063" observedRunningTime="2026-04-22 21:13:53.450184218 +0000 UTC m=+311.558995131" watchObservedRunningTime="2026-04-22 21:13:53.452223565 +0000 UTC m=+311.561034478" Apr 22 21:13:53.463104 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:53.463061 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv" podStartSLOduration=1.263914095 podStartE2EDuration="3.463047865s" podCreationTimestamp="2026-04-22 21:13:50 
+0000 UTC" firstStartedPulling="2026-04-22 21:13:50.547679739 +0000 UTC m=+308.656490631" lastFinishedPulling="2026-04-22 21:13:52.74681351 +0000 UTC m=+310.855624401" observedRunningTime="2026-04-22 21:13:53.462384661 +0000 UTC m=+311.571195599" watchObservedRunningTime="2026-04-22 21:13:53.463047865 +0000 UTC m=+311.571858778" Apr 22 21:13:53.669382 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:53.669358 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv" Apr 22 21:13:53.773264 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:53.773164 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzd6g\" (UniqueName: \"kubernetes.io/projected/a1b2e0d4-24a7-4774-9449-0f3e5ef6428b-kube-api-access-qzd6g\") pod \"a1b2e0d4-24a7-4774-9449-0f3e5ef6428b\" (UID: \"a1b2e0d4-24a7-4774-9449-0f3e5ef6428b\") " Apr 22 21:13:53.775378 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:53.775355 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b2e0d4-24a7-4774-9449-0f3e5ef6428b-kube-api-access-qzd6g" (OuterVolumeSpecName: "kube-api-access-qzd6g") pod "a1b2e0d4-24a7-4774-9449-0f3e5ef6428b" (UID: "a1b2e0d4-24a7-4774-9449-0f3e5ef6428b"). InnerVolumeSpecName "kube-api-access-qzd6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:13:53.874165 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:53.874125 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qzd6g\" (UniqueName: \"kubernetes.io/projected/a1b2e0d4-24a7-4774-9449-0f3e5ef6428b-kube-api-access-qzd6g\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:13:54.441878 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.441838 2570 generic.go:358] "Generic (PLEG): container finished" podID="a1b2e0d4-24a7-4774-9449-0f3e5ef6428b" containerID="be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb" exitCode=0 Apr 22 21:13:54.442335 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.441900 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv" Apr 22 21:13:54.442335 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.441931 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv" event={"ID":"a1b2e0d4-24a7-4774-9449-0f3e5ef6428b","Type":"ContainerDied","Data":"be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb"} Apr 22 21:13:54.442335 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.441975 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-zvsfv" event={"ID":"a1b2e0d4-24a7-4774-9449-0f3e5ef6428b","Type":"ContainerDied","Data":"b02cc50aef24a7ed638eaa289e1ba1739177bd599842531deeae19b7202175ae"} Apr 22 21:13:54.442335 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.441998 2570 scope.go:117] "RemoveContainer" containerID="be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb" Apr 22 21:13:54.452561 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.452538 2570 scope.go:117] "RemoveContainer" containerID="be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb" Apr 22 21:13:54.452968 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:13:54.452942 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb\": container with ID starting with be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb not found: ID does not exist" containerID="be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb" Apr 22 21:13:54.453030 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.452980 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb"} err="failed to get container status \"be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb\": rpc error: code = NotFound desc = could not find container \"be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb\": container with ID starting with be11cb27cbec10581f1e3a0df7f407f4e50e2ff51d704ad6a0b3ecb24953a1fb not found: ID does not exist" Apr 22 21:13:54.465548 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.465517 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zvsfv"] Apr 22 21:13:54.466294 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.466275 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zvsfv"] Apr 22 21:13:54.489232 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:13:54.489205 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b2e0d4-24a7-4774-9449-0f3e5ef6428b" path="/var/lib/kubelet/pods/a1b2e0d4-24a7-4774-9449-0f3e5ef6428b/volumes" Apr 22 21:14:00.985266 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:00.985220 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" Apr 22 21:14:00.985743 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:00.985282 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" Apr 22 21:14:01.006379 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:01.006349 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-ngbgq" Apr 22 21:14:01.482097 
Apr 22 21:14:01.482097 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:01.482067 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-ngbgq"
Apr 22 21:14:23.536238 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.536204 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"]
Apr 22 21:14:23.536781 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.536609 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1b2e0d4-24a7-4774-9449-0f3e5ef6428b" containerName="registry-server"
Apr 22 21:14:23.536781 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.536628 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b2e0d4-24a7-4774-9449-0f3e5ef6428b" containerName="registry-server"
Apr 22 21:14:23.536781 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.536695 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1b2e0d4-24a7-4774-9449-0f3e5ef6428b" containerName="registry-server"
Apr 22 21:14:23.545742 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.545718 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:23.548222 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.548202 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-84s6w\""
Apr 22 21:14:23.554658 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.554636 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"]
Apr 22 21:14:23.587891 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.587855 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwsgx\" (UniqueName: \"kubernetes.io/projected/bac67062-c85e-48a0-93b6-96cb262a5fe2-kube-api-access-hwsgx\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6jll5\" (UID: \"bac67062-c85e-48a0-93b6-96cb262a5fe2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:23.588044 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.587938 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bac67062-c85e-48a0-93b6-96cb262a5fe2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6jll5\" (UID: \"bac67062-c85e-48a0-93b6-96cb262a5fe2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:23.688852 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.688811 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwsgx\" (UniqueName: \"kubernetes.io/projected/bac67062-c85e-48a0-93b6-96cb262a5fe2-kube-api-access-hwsgx\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6jll5\" (UID: \"bac67062-c85e-48a0-93b6-96cb262a5fe2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:23.689035 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.688897 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bac67062-c85e-48a0-93b6-96cb262a5fe2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6jll5\" (UID: \"bac67062-c85e-48a0-93b6-96cb262a5fe2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:23.689243 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.689226 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bac67062-c85e-48a0-93b6-96cb262a5fe2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6jll5\" (UID: \"bac67062-c85e-48a0-93b6-96cb262a5fe2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:23.702738 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.702712 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwsgx\" (UniqueName: \"kubernetes.io/projected/bac67062-c85e-48a0-93b6-96cb262a5fe2-kube-api-access-hwsgx\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6jll5\" (UID: \"bac67062-c85e-48a0-93b6-96cb262a5fe2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:23.855993 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.855904 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:23.981809 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:23.981781 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"]
Apr 22 21:14:23.984542 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:14:23.984505 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbac67062_c85e_48a0_93b6_96cb262a5fe2.slice/crio-74161e8675b38aa63b1a2449dc8067d3f345b92bfac3eb1e2bd3331dd35ea44b WatchSource:0}: Error finding container 74161e8675b38aa63b1a2449dc8067d3f345b92bfac3eb1e2bd3331dd35ea44b: Status 404 returned error can't find the container with id 74161e8675b38aa63b1a2449dc8067d3f345b92bfac3eb1e2bd3331dd35ea44b
Apr 22 21:14:24.532072 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:24.532035 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5" event={"ID":"bac67062-c85e-48a0-93b6-96cb262a5fe2","Type":"ContainerStarted","Data":"74161e8675b38aa63b1a2449dc8067d3f345b92bfac3eb1e2bd3331dd35ea44b"}
Apr 22 21:14:29.549011 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:29.548976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5" event={"ID":"bac67062-c85e-48a0-93b6-96cb262a5fe2","Type":"ContainerStarted","Data":"aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e"}
Apr 22 21:14:29.549402 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:29.549120 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:29.577343 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:29.577296 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5" podStartSLOduration=1.647263629 podStartE2EDuration="6.57727911s" podCreationTimestamp="2026-04-22 21:14:23 +0000 UTC" firstStartedPulling="2026-04-22 21:14:23.987526872 +0000 UTC m=+342.096337763" lastFinishedPulling="2026-04-22 21:14:28.917542348 +0000 UTC m=+347.026353244" observedRunningTime="2026-04-22 21:14:29.575386995 +0000 UTC m=+347.684197909" watchObservedRunningTime="2026-04-22 21:14:29.57727911 +0000 UTC m=+347.686090021"
Apr 22 21:14:31.961436 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:31.961355 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"]
Apr 22 21:14:31.964451 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:31.964427 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Apr 22 21:14:31.966779 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:31.966755 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 22 21:14:31.967020 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:31.967000 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-d8nc4\""
Apr 22 21:14:31.967113 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:31.967002 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 22 21:14:31.973013 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:31.972988 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"]
Apr 22 21:14:32.057203 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.057169 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29d9d6f1-be0b-4aa1-9588-49353ca77887-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-d9hft\" (UID: \"29d9d6f1-be0b-4aa1-9588-49353ca77887\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Apr 22 21:14:32.057203 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.057206 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/29d9d6f1-be0b-4aa1-9588-49353ca77887-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-d9hft\" (UID: \"29d9d6f1-be0b-4aa1-9588-49353ca77887\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Apr 22 21:14:32.057434 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.057232 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9gzc\" (UniqueName: \"kubernetes.io/projected/29d9d6f1-be0b-4aa1-9588-49353ca77887-kube-api-access-c9gzc\") pod \"kuadrant-console-plugin-6cb54b5c86-d9hft\" (UID: \"29d9d6f1-be0b-4aa1-9588-49353ca77887\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Apr 22 21:14:32.158593 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.158559 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29d9d6f1-be0b-4aa1-9588-49353ca77887-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-d9hft\" (UID: \"29d9d6f1-be0b-4aa1-9588-49353ca77887\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Apr 22 21:14:32.158593 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.158596 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/29d9d6f1-be0b-4aa1-9588-49353ca77887-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-d9hft\" (UID: \"29d9d6f1-be0b-4aa1-9588-49353ca77887\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Apr 22 21:14:32.158801 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.158621 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9gzc\" (UniqueName: \"kubernetes.io/projected/29d9d6f1-be0b-4aa1-9588-49353ca77887-kube-api-access-c9gzc\") pod \"kuadrant-console-plugin-6cb54b5c86-d9hft\" (UID: \"29d9d6f1-be0b-4aa1-9588-49353ca77887\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Apr 22 21:14:32.159200 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.159172 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29d9d6f1-be0b-4aa1-9588-49353ca77887-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-d9hft\" (UID: \"29d9d6f1-be0b-4aa1-9588-49353ca77887\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Apr 22 21:14:32.161202 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.161181 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/29d9d6f1-be0b-4aa1-9588-49353ca77887-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-d9hft\" (UID: \"29d9d6f1-be0b-4aa1-9588-49353ca77887\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Apr 22 21:14:32.168300 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.168242 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9gzc\" (UniqueName: \"kubernetes.io/projected/29d9d6f1-be0b-4aa1-9588-49353ca77887-kube-api-access-c9gzc\") pod \"kuadrant-console-plugin-6cb54b5c86-d9hft\" (UID: \"29d9d6f1-be0b-4aa1-9588-49353ca77887\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft" Apr 22 21:14:32.393344 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.393320 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft"] Apr 22 21:14:32.395825 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:14:32.395797 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29d9d6f1_be0b_4aa1_9588_49353ca77887.slice/crio-61dbb701e2da55690ecff81208da8d68fc96ef9990e72603bf9c3489a9a244e5 WatchSource:0}: Error finding container 61dbb701e2da55690ecff81208da8d68fc96ef9990e72603bf9c3489a9a244e5: Status 404 returned error can't find the container with id 61dbb701e2da55690ecff81208da8d68fc96ef9990e72603bf9c3489a9a244e5 Apr 22 21:14:32.559608 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:32.559524 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft" event={"ID":"29d9d6f1-be0b-4aa1-9588-49353ca77887","Type":"ContainerStarted","Data":"61dbb701e2da55690ecff81208da8d68fc96ef9990e72603bf9c3489a9a244e5"} Apr 22 21:14:40.555711 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:40.555676 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5" Apr 22 21:14:42.297615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.297563 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"] Apr 22 21:14:42.298080 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.297820 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5" podUID="bac67062-c85e-48a0-93b6-96cb262a5fe2" containerName="manager" containerID="cri-o://aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e" gracePeriod=2 Apr 22 21:14:42.308602 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.308546 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"] Apr 22 21:14:42.317019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.316317 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72"] Apr 22 21:14:42.317019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.316707 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bac67062-c85e-48a0-93b6-96cb262a5fe2" containerName="manager" Apr 22 21:14:42.317019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.316725 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac67062-c85e-48a0-93b6-96cb262a5fe2" containerName="manager" Apr 22 21:14:42.317019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.316811 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="bac67062-c85e-48a0-93b6-96cb262a5fe2" containerName="manager" Apr 22 21:14:42.320389 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.320369 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" Apr 22 21:14:42.332296 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.332268 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72"] Apr 22 21:14:42.442216 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.442182 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24f780ce-459d-42f0-aa44-187bafe06354-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2pc72\" (UID: \"24f780ce-459d-42f0-aa44-187bafe06354\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" Apr 22 21:14:42.442372 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.442221 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5859p\" (UniqueName: \"kubernetes.io/projected/24f780ce-459d-42f0-aa44-187bafe06354-kube-api-access-5859p\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2pc72\" (UID: \"24f780ce-459d-42f0-aa44-187bafe06354\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" Apr 22 21:14:42.538907 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.538881 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5" Apr 22 21:14:42.543233 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.543205 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24f780ce-459d-42f0-aa44-187bafe06354-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2pc72\" (UID: \"24f780ce-459d-42f0-aa44-187bafe06354\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" Apr 22 21:14:42.543387 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.543242 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5859p\" (UniqueName: \"kubernetes.io/projected/24f780ce-459d-42f0-aa44-187bafe06354-kube-api-access-5859p\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2pc72\" (UID: \"24f780ce-459d-42f0-aa44-187bafe06354\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" Apr 22 21:14:42.543669 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.543642 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24f780ce-459d-42f0-aa44-187bafe06354-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2pc72\" (UID: \"24f780ce-459d-42f0-aa44-187bafe06354\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" Apr 22 21:14:42.551971 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.551911 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5859p\" (UniqueName: \"kubernetes.io/projected/24f780ce-459d-42f0-aa44-187bafe06354-kube-api-access-5859p\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-2pc72\" (UID: \"24f780ce-459d-42f0-aa44-187bafe06354\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" Apr 22 21:14:42.593963 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.593926 2570 
Apr 22 21:14:42.593963 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.593926 2570 generic.go:358] "Generic (PLEG): container finished" podID="bac67062-c85e-48a0-93b6-96cb262a5fe2" containerID="aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e" exitCode=0
Apr 22 21:14:42.594109 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.593977 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6jll5"
Apr 22 21:14:42.594109 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.594001 2570 scope.go:117] "RemoveContainer" containerID="aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e"
Apr 22 21:14:42.601410 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.601391 2570 scope.go:117] "RemoveContainer" containerID="aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e"
Apr 22 21:14:42.601691 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:14:42.601664 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e\": container with ID starting with aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e not found: ID does not exist" containerID="aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e"
Apr 22 21:14:42.601769 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.601695 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e"} err="failed to get container status \"aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e\": rpc error: code = NotFound desc = could not find container \"aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e\": container with ID starting with aeae1732037b38c891c9cad87a8d71265706e8292e175f5ae7d642534e2b930e not found: ID does not exist"
Apr 22 21:14:42.644155 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.644122 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bac67062-c85e-48a0-93b6-96cb262a5fe2-extensions-socket-volume\") pod \"bac67062-c85e-48a0-93b6-96cb262a5fe2\" (UID: \"bac67062-c85e-48a0-93b6-96cb262a5fe2\") "
Apr 22 21:14:42.644336 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.644184 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwsgx\" (UniqueName: \"kubernetes.io/projected/bac67062-c85e-48a0-93b6-96cb262a5fe2-kube-api-access-hwsgx\") pod \"bac67062-c85e-48a0-93b6-96cb262a5fe2\" (UID: \"bac67062-c85e-48a0-93b6-96cb262a5fe2\") "
Apr 22 21:14:42.644738 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.644707 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bac67062-c85e-48a0-93b6-96cb262a5fe2-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "bac67062-c85e-48a0-93b6-96cb262a5fe2" (UID: "bac67062-c85e-48a0-93b6-96cb262a5fe2"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:14:42.646552 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.646516 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bac67062-c85e-48a0-93b6-96cb262a5fe2-kube-api-access-hwsgx" (OuterVolumeSpecName: "kube-api-access-hwsgx") pod "bac67062-c85e-48a0-93b6-96cb262a5fe2" (UID: "bac67062-c85e-48a0-93b6-96cb262a5fe2"). InnerVolumeSpecName "kube-api-access-hwsgx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:14:42.685003 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.684970 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72"
Apr 22 21:14:42.745517 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.745472 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/bac67062-c85e-48a0-93b6-96cb262a5fe2-extensions-socket-volume\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 22 21:14:42.745517 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.745507 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwsgx\" (UniqueName: \"kubernetes.io/projected/bac67062-c85e-48a0-93b6-96cb262a5fe2-kube-api-access-hwsgx\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 22 21:14:42.817861 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:42.817829 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72"]
Apr 22 21:14:44.491737 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:44.491698 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bac67062-c85e-48a0-93b6-96cb262a5fe2" path="/var/lib/kubelet/pods/bac67062-c85e-48a0-93b6-96cb262a5fe2/volumes"
Apr 22 21:14:56.899776 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:14:56.899742 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24f780ce_459d_42f0_aa44_187bafe06354.slice/crio-3373d8c51f214a30c0eb0864f6e324fd803eba0e7cdba31c50ca6bdfc4f4cfe1 WatchSource:0}: Error finding container 3373d8c51f214a30c0eb0864f6e324fd803eba0e7cdba31c50ca6bdfc4f4cfe1: Status 404 returned error can't find the container with id 3373d8c51f214a30c0eb0864f6e324fd803eba0e7cdba31c50ca6bdfc4f4cfe1
Apr 22 21:14:57.651047 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:57.651009 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" event={"ID":"24f780ce-459d-42f0-aa44-187bafe06354","Type":"ContainerStarted","Data":"e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c"}
Apr 22 21:14:57.651262 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:57.651055 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" event={"ID":"24f780ce-459d-42f0-aa44-187bafe06354","Type":"ContainerStarted","Data":"3373d8c51f214a30c0eb0864f6e324fd803eba0e7cdba31c50ca6bdfc4f4cfe1"}
Apr 22 21:14:57.651262 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:57.651145 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72"
for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft" event={"ID":"29d9d6f1-be0b-4aa1-9588-49353ca77887","Type":"ContainerStarted","Data":"a3fce2e79edea1517200754188238d58c15763cb0ab26348d92967a69a25490e"} Apr 22 21:14:57.669622 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:57.669582 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" podStartSLOduration=15.669571957 podStartE2EDuration="15.669571957s" podCreationTimestamp="2026-04-22 21:14:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:14:57.66831444 +0000 UTC m=+375.777125352" watchObservedRunningTime="2026-04-22 21:14:57.669571957 +0000 UTC m=+375.778382870" Apr 22 21:14:57.682172 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:57.682123 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-d9hft" podStartSLOduration=2.097543197 podStartE2EDuration="26.682108189s" podCreationTimestamp="2026-04-22 21:14:31 +0000 UTC" firstStartedPulling="2026-04-22 21:14:32.397045824 +0000 UTC m=+350.505856715" lastFinishedPulling="2026-04-22 21:14:56.981610812 +0000 UTC m=+375.090421707" observedRunningTime="2026-04-22 21:14:57.681413488 +0000 UTC m=+375.790224402" watchObservedRunningTime="2026-04-22 21:14:57.682108189 +0000 UTC m=+375.790919103" Apr 22 21:14:59.099669 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:59.099634 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72"] Apr 22 21:14:59.659196 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:59.659154 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" podUID="24f780ce-459d-42f0-aa44-187bafe06354" containerName="manager" containerID="cri-o://e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c" gracePeriod=10 Apr 22 21:14:59.893782 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:59.893761 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" Apr 22 21:14:59.988175 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:59.988137 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24f780ce-459d-42f0-aa44-187bafe06354-extensions-socket-volume\") pod \"24f780ce-459d-42f0-aa44-187bafe06354\" (UID: \"24f780ce-459d-42f0-aa44-187bafe06354\") " Apr 22 21:14:59.988175 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:59.988181 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5859p\" (UniqueName: \"kubernetes.io/projected/24f780ce-459d-42f0-aa44-187bafe06354-kube-api-access-5859p\") pod \"24f780ce-459d-42f0-aa44-187bafe06354\" (UID: \"24f780ce-459d-42f0-aa44-187bafe06354\") " Apr 22 21:14:59.988615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:59.988586 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24f780ce-459d-42f0-aa44-187bafe06354-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "24f780ce-459d-42f0-aa44-187bafe06354" (UID: "24f780ce-459d-42f0-aa44-187bafe06354"). 
InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:14:59.990297 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:14:59.990268 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f780ce-459d-42f0-aa44-187bafe06354-kube-api-access-5859p" (OuterVolumeSpecName: "kube-api-access-5859p") pod "24f780ce-459d-42f0-aa44-187bafe06354" (UID: "24f780ce-459d-42f0-aa44-187bafe06354"). InnerVolumeSpecName "kube-api-access-5859p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:15:00.089443 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.089393 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24f780ce-459d-42f0-aa44-187bafe06354-extensions-socket-volume\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:15:00.089443 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.089439 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5859p\" (UniqueName: \"kubernetes.io/projected/24f780ce-459d-42f0-aa44-187bafe06354-kube-api-access-5859p\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:15:00.663245 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.663212 2570 generic.go:358] "Generic (PLEG): container finished" podID="24f780ce-459d-42f0-aa44-187bafe06354" containerID="e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c" exitCode=0 Apr 22 21:15:00.663693 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.663288 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" event={"ID":"24f780ce-459d-42f0-aa44-187bafe06354","Type":"ContainerDied","Data":"e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c"} Apr 22 21:15:00.663693 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.663318 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72" event={"ID":"24f780ce-459d-42f0-aa44-187bafe06354","Type":"ContainerDied","Data":"3373d8c51f214a30c0eb0864f6e324fd803eba0e7cdba31c50ca6bdfc4f4cfe1"} Apr 22 21:15:00.663693 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.663335 2570 scope.go:117] "RemoveContainer" containerID="e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c" Apr 22 21:15:00.663693 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.663291 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72"
Apr 22 21:15:00.671643 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.671615 2570 scope.go:117] "RemoveContainer" containerID="e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c"
Apr 22 21:15:00.671915 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:15:00.671896 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c\": container with ID starting with e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c not found: ID does not exist" containerID="e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c"
Apr 22 21:15:00.671968 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.671923 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c"} err="failed to get container status \"e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c\": rpc error: code = NotFound desc = could not find container \"e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c\": container with ID starting with e28dd846e512594cf8b8d936816e5ca12d231f42113a3f4c2c755a6f9c70f94c not found: ID does not exist"
Apr 22 21:15:00.680924 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.680899 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72"]
Apr 22 21:15:00.685330 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:00.685307 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-2pc72"]
Apr 22 21:15:02.489039 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:02.489001 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f780ce-459d-42f0-aa44-187bafe06354" path="/var/lib/kubelet/pods/24f780ce-459d-42f0-aa44-187bafe06354/volumes"
Apr 22 21:15:15.290830 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.290791 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"]
Apr 22 21:15:15.291300 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.291090 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24f780ce-459d-42f0-aa44-187bafe06354" containerName="manager"
Apr 22 21:15:15.291300 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.291101 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f780ce-459d-42f0-aa44-187bafe06354" containerName="manager"
Apr 22 21:15:15.291300 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.291157 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="24f780ce-459d-42f0-aa44-187bafe06354" containerName="manager"
Apr 22 21:15:15.333535 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.333504 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"]
Apr 22 21:15:15.333695 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.333626 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.336215 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.336193 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-dxjpt\""
Apr 22 21:15:15.515029 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.514994 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.515029 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.515027 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.515227 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.515052 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.515227 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.515120 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.515227 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.515195 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf4r2\" (UniqueName: \"kubernetes.io/projected/684e367c-ce68-4452-b7f4-9d7004a05e85-kube-api-access-xf4r2\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.515347 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.515226 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.515347 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.515272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.515347 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.515301 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/684e367c-ce68-4452-b7f4-9d7004a05e85-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.515347 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.515323 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.616652 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.616564 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xf4r2\" (UniqueName: \"kubernetes.io/projected/684e367c-ce68-4452-b7f4-9d7004a05e85-kube-api-access-xf4r2\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.616652 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.616609 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.616652 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.616652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.616895 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.616684 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/684e367c-ce68-4452-b7f4-9d7004a05e85-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.616895 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.616725 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.616895 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.616776 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.616895 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.616799 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.616895 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.616823 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.616895 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.616851 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.617187 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.617161 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.617276 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.617211 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.617451 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.617421 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.617557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.617432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.617557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.617434 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/684e367c-ce68-4452-b7f4-9d7004a05e85-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.619000 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.618983 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.619222 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.619205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.624568 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.624546 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/684e367c-ce68-4452-b7f4-9d7004a05e85-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.625399 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.625377 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf4r2\" (UniqueName: \"kubernetes.io/projected/684e367c-ce68-4452-b7f4-9d7004a05e85-kube-api-access-xf4r2\") pod \"maas-default-gateway-openshift-default-845c6b4b48-7slzq\" (UID: \"684e367c-ce68-4452-b7f4-9d7004a05e85\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.645318 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.645298 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:15.767521 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.767439 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"]
Apr 22 21:15:15.769632 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:15:15.769598 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod684e367c_ce68_4452_b7f4_9d7004a05e85.slice/crio-5d4b9533a5b60a3333c820a31e601896408c90603dd95181ddd4753bf088a3ef WatchSource:0}: Error finding container 5d4b9533a5b60a3333c820a31e601896408c90603dd95181ddd4753bf088a3ef: Status 404 returned error can't find the container with id 5d4b9533a5b60a3333c820a31e601896408c90603dd95181ddd4753bf088a3ef
Apr 22 21:15:15.771863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.771833 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 22 21:15:15.771959 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.771893 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 22 21:15:15.771959 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:15.771939 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 22 21:15:16.716833 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:16.716802 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq" event={"ID":"684e367c-ce68-4452-b7f4-9d7004a05e85","Type":"ContainerStarted","Data":"b5e8f773500d7af565c907ffc65155deafba883e451a4076c624a5962191aebd"}
Apr 22 21:15:16.716833 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:16.716837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq" event={"ID":"684e367c-ce68-4452-b7f4-9d7004a05e85","Type":"ContainerStarted","Data":"5d4b9533a5b60a3333c820a31e601896408c90603dd95181ddd4753bf088a3ef"}
Apr 22 21:15:16.735743 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:16.735692 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq" podStartSLOduration=1.73567625 podStartE2EDuration="1.73567625s" podCreationTimestamp="2026-04-22 21:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:15:16.733915767 +0000 UTC m=+394.842726680" watchObservedRunningTime="2026-04-22 21:15:16.73567625 +0000 UTC m=+394.844487162"
Apr 22 21:15:17.646326 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:17.646290 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:17.651051 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:17.651027 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:17.719936 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:17.719898 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:17.720839 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:17.720824 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-7slzq"
Apr 22 21:15:19.513037 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.512963 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6jhtq"]
Apr 22 21:15:19.516228 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.516211 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:19.518567 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.518550 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 22 21:15:19.526086 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.526059 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6jhtq"]
Apr 22 21:15:19.610808 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.610775 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6jhtq"]
Apr 22 21:15:19.649602 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.649564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp48n\" (UniqueName: \"kubernetes.io/projected/8630fed9-52e5-423c-a89b-746ef1cfa0e1-kube-api-access-hp48n\") pod \"limitador-limitador-7d549b5b-6jhtq\" (UID: \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:19.649751 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.649631 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8630fed9-52e5-423c-a89b-746ef1cfa0e1-config-file\") pod \"limitador-limitador-7d549b5b-6jhtq\" (UID: \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:19.750627 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.750589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8630fed9-52e5-423c-a89b-746ef1cfa0e1-config-file\") pod \"limitador-limitador-7d549b5b-6jhtq\" (UID: \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:19.750791 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.750645 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hp48n\" (UniqueName: \"kubernetes.io/projected/8630fed9-52e5-423c-a89b-746ef1cfa0e1-kube-api-access-hp48n\") pod \"limitador-limitador-7d549b5b-6jhtq\" (UID: \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:19.751440 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.751418 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8630fed9-52e5-423c-a89b-746ef1cfa0e1-config-file\") pod \"limitador-limitador-7d549b5b-6jhtq\" (UID: \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:19.757735 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.757708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp48n\" (UniqueName: \"kubernetes.io/projected/8630fed9-52e5-423c-a89b-746ef1cfa0e1-kube-api-access-hp48n\") pod \"limitador-limitador-7d549b5b-6jhtq\" (UID: \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:19.830900 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.830811 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:19.897052 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.897023 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"]
Apr 22 21:15:19.901498 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.901481 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:19.908083 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.908037 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"]
Apr 22 21:15:19.929030 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.929004 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"]
Apr 22 21:15:19.954328 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:19.954295 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6jhtq"]
Apr 22 21:15:19.958019 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:15:19.957986 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8630fed9_52e5_423c_a89b_746ef1cfa0e1.slice/crio-dfe32bf93f97db64cf04b0bfbadcbb4d7acbeb9da0bf7295d90a28f15f916a74 WatchSource:0}: Error finding container dfe32bf93f97db64cf04b0bfbadcbb4d7acbeb9da0bf7295d90a28f15f916a74: Status 404 returned error can't find the container with id dfe32bf93f97db64cf04b0bfbadcbb4d7acbeb9da0bf7295d90a28f15f916a74
Apr 22 21:15:20.053291 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.053228 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhmcf\" (UniqueName: \"kubernetes.io/projected/b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4-kube-api-access-xhmcf\") pod \"limitador-limitador-78c99df468-kr4vc\" (UID: \"b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4\") " pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:20.053291 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.053289 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4-config-file\") pod \"limitador-limitador-78c99df468-kr4vc\" (UID: \"b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4\") " pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:20.154641 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.154547 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhmcf\" (UniqueName: \"kubernetes.io/projected/b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4-kube-api-access-xhmcf\") pod \"limitador-limitador-78c99df468-kr4vc\" (UID: \"b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4\") " pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:20.154641 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.154624 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4-config-file\") pod \"limitador-limitador-78c99df468-kr4vc\" (UID: \"b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4\") " pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:20.155208 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.155184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4-config-file\") pod \"limitador-limitador-78c99df468-kr4vc\" (UID: \"b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4\") " pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:20.164286 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.164260 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhmcf\" (UniqueName: \"kubernetes.io/projected/b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4-kube-api-access-xhmcf\") pod \"limitador-limitador-78c99df468-kr4vc\" (UID: \"b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4\") " pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:20.217161 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.217128 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:20.331887 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.331769 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"]
Apr 22 21:15:20.334466 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:15:20.334439 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4cfd43b_fb04_4855_b4b9_0dbbd49c42b4.slice/crio-129793cfea46fa5f1a03f00d80ea588917566ed5861cc7cd7c8db8b639313677 WatchSource:0}: Error finding container 129793cfea46fa5f1a03f00d80ea588917566ed5861cc7cd7c8db8b639313677: Status 404 returned error can't find the container with id 129793cfea46fa5f1a03f00d80ea588917566ed5861cc7cd7c8db8b639313677
Apr 22 21:15:20.730711 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.730667 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc" event={"ID":"b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4","Type":"ContainerStarted","Data":"129793cfea46fa5f1a03f00d80ea588917566ed5861cc7cd7c8db8b639313677"}
Apr 22 21:15:20.731767 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:20.731743 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq" event={"ID":"8630fed9-52e5-423c-a89b-746ef1cfa0e1","Type":"ContainerStarted","Data":"dfe32bf93f97db64cf04b0bfbadcbb4d7acbeb9da0bf7295d90a28f15f916a74"}
Apr 22 21:15:23.743813 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:23.743774 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq" event={"ID":"8630fed9-52e5-423c-a89b-746ef1cfa0e1","Type":"ContainerStarted","Data":"f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978"}
Apr 22 21:15:23.744282 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:23.743907 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:23.745146 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:23.745124 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc" event={"ID":"b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4","Type":"ContainerStarted","Data":"594e33694eff37b5421ef86dd71e63a47036e7f7dfee7154cc537ac30c3008a2"}
Apr 22 21:15:23.745277 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:23.745258 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:23.759594 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:23.759549 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq" podStartSLOduration=1.7220414320000001 podStartE2EDuration="4.759536225s" podCreationTimestamp="2026-04-22 21:15:19 +0000 UTC" firstStartedPulling="2026-04-22 21:15:19.959779814 +0000 UTC m=+398.068590704" lastFinishedPulling="2026-04-22 21:15:22.997274603 +0000 UTC m=+401.106085497" observedRunningTime="2026-04-22 21:15:23.757468734 +0000 UTC m=+401.866279659" watchObservedRunningTime="2026-04-22 21:15:23.759536225 +0000 UTC m=+401.868347138"
Apr 22 21:15:23.771913 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:23.771856 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc" podStartSLOduration=2.100475268 podStartE2EDuration="4.771844538s" podCreationTimestamp="2026-04-22 21:15:19 +0000 UTC" firstStartedPulling="2026-04-22 21:15:20.336206592 +0000 UTC m=+398.445017483" lastFinishedPulling="2026-04-22 21:15:23.007575847 +0000 UTC m=+401.116386753" observedRunningTime="2026-04-22 21:15:23.770636141 +0000 UTC m=+401.879447054" watchObservedRunningTime="2026-04-22 21:15:23.771844538 +0000 UTC m=+401.880655450"
Apr 22 21:15:34.750023 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:34.749989 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-kr4vc"
Apr 22 21:15:34.750466 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:34.750038 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:34.818534 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:34.818501 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6jhtq"]
Apr 22 21:15:34.818718 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:34.818689 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq" podUID="8630fed9-52e5-423c-a89b-746ef1cfa0e1" containerName="limitador" containerID="cri-o://f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978" gracePeriod=30
Apr 22 21:15:35.756751 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.756725 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:35.784058 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.784020 2570 generic.go:358] "Generic (PLEG): container finished" podID="8630fed9-52e5-423c-a89b-746ef1cfa0e1" containerID="f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978" exitCode=0
Apr 22 21:15:35.784199 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.784078 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq"
Apr 22 21:15:35.784199 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.784101 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq" event={"ID":"8630fed9-52e5-423c-a89b-746ef1cfa0e1","Type":"ContainerDied","Data":"f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978"}
Apr 22 21:15:35.784199 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.784140 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-6jhtq" event={"ID":"8630fed9-52e5-423c-a89b-746ef1cfa0e1","Type":"ContainerDied","Data":"dfe32bf93f97db64cf04b0bfbadcbb4d7acbeb9da0bf7295d90a28f15f916a74"}
Apr 22 21:15:35.784199 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.784156 2570 scope.go:117] "RemoveContainer" containerID="f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978"
Apr 22 21:15:35.785001 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.784984 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp48n\" (UniqueName: \"kubernetes.io/projected/8630fed9-52e5-423c-a89b-746ef1cfa0e1-kube-api-access-hp48n\") pod \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\" (UID: \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\") "
Apr 22 21:15:35.785122 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.785029 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8630fed9-52e5-423c-a89b-746ef1cfa0e1-config-file\") pod \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\" (UID: \"8630fed9-52e5-423c-a89b-746ef1cfa0e1\") "
Apr 22 21:15:35.785523 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.785495 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8630fed9-52e5-423c-a89b-746ef1cfa0e1-config-file" (OuterVolumeSpecName: "config-file") pod "8630fed9-52e5-423c-a89b-746ef1cfa0e1" (UID: "8630fed9-52e5-423c-a89b-746ef1cfa0e1"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 21:15:35.787186 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.787162 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8630fed9-52e5-423c-a89b-746ef1cfa0e1-kube-api-access-hp48n" (OuterVolumeSpecName: "kube-api-access-hp48n") pod "8630fed9-52e5-423c-a89b-746ef1cfa0e1" (UID: "8630fed9-52e5-423c-a89b-746ef1cfa0e1"). InnerVolumeSpecName "kube-api-access-hp48n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:15:35.796149 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.796127 2570 scope.go:117] "RemoveContainer" containerID="f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978"
Apr 22 21:15:35.796438 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:15:35.796416 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978\": container with ID starting with f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978 not found: ID does not exist" containerID="f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978"
Apr 22 21:15:35.796490 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.796448 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978"} err="failed to get container status \"f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978\": rpc error: code = NotFound desc = could not find container \"f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978\": container with ID starting with f1e8aee398b1add51bc669b667a0ec38a3ff232c382be5ff835d451a484b4978 not found: ID does not exist"
Apr 22 21:15:35.886522 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.886433 2570 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/8630fed9-52e5-423c-a89b-746ef1cfa0e1-config-file\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 22 21:15:35.886522 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:35.886465 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hp48n\" (UniqueName: \"kubernetes.io/projected/8630fed9-52e5-423c-a89b-746ef1cfa0e1-kube-api-access-hp48n\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 22 21:15:36.106602 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:36.106571 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6jhtq"]
Apr 22 21:15:36.109494 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:36.109464 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-6jhtq"]
Apr 22 21:15:36.489625 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:36.489593 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8630fed9-52e5-423c-a89b-746ef1cfa0e1" path="/var/lib/kubelet/pods/8630fed9-52e5-423c-a89b-746ef1cfa0e1/volumes"
Apr 22 21:15:56.337037 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.336998 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-hxxfk"]
Apr 22 21:15:56.337525 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.337364 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8630fed9-52e5-423c-a89b-746ef1cfa0e1" containerName="limitador"
Apr 22 21:15:56.337525 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.337378 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630fed9-52e5-423c-a89b-746ef1cfa0e1" containerName="limitador"
Apr 22 21:15:56.337525 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.337430 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8630fed9-52e5-423c-a89b-746ef1cfa0e1" containerName="limitador"
Apr 22 21:15:56.346793 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.346736 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk"
Apr 22 21:15:56.347118 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.347082 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-hxxfk"]
Apr 22 21:15:56.349044 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.349021 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-xw8js\""
Apr 22 21:15:56.466611 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.466580 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjsdc\" (UniqueName: \"kubernetes.io/projected/218f3f80-1d77-40c6-904b-86c941da22a2-kube-api-access-cjsdc\") pod \"maas-controller-6d4c8f55f9-hxxfk\" (UID: \"218f3f80-1d77-40c6-904b-86c941da22a2\") " pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk"
Apr 22 21:15:56.489992 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.489963 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-b44d57d8d-wswwn"]
Apr 22 21:15:56.493408 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.493392 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-b44d57d8d-wswwn"
Apr 22 21:15:56.499023 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.498984 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-wswwn"]
Apr 22 21:15:56.567704 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.567662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjsdc\" (UniqueName: \"kubernetes.io/projected/218f3f80-1d77-40c6-904b-86c941da22a2-kube-api-access-cjsdc\") pod \"maas-controller-6d4c8f55f9-hxxfk\" (UID: \"218f3f80-1d77-40c6-904b-86c941da22a2\") " pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk"
Apr 22 21:15:56.567884 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.567733 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hfmf\" (UniqueName: \"kubernetes.io/projected/f5e53f50-9601-4b94-8f11-92dce248be21-kube-api-access-8hfmf\") pod \"maas-controller-b44d57d8d-wswwn\" (UID: \"f5e53f50-9601-4b94-8f11-92dce248be21\") " pod="opendatahub/maas-controller-b44d57d8d-wswwn"
Apr 22 21:15:56.575866 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.575845 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjsdc\" (UniqueName: \"kubernetes.io/projected/218f3f80-1d77-40c6-904b-86c941da22a2-kube-api-access-cjsdc\") pod \"maas-controller-6d4c8f55f9-hxxfk\" (UID: \"218f3f80-1d77-40c6-904b-86c941da22a2\") " pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk"
Apr 22 21:15:56.605436 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.605359 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-hxxfk"]
Apr 22 21:15:56.605600 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.605589 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk"
Apr 22 21:15:56.628454 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.628416 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-cbw52"]
Apr 22 21:15:56.633151 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.633125 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c6497bbdb-cbw52"
Apr 22 21:15:56.642266 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.642212 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-cbw52"]
Apr 22 21:15:56.668222 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.668187 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hfmf\" (UniqueName: \"kubernetes.io/projected/f5e53f50-9601-4b94-8f11-92dce248be21-kube-api-access-8hfmf\") pod \"maas-controller-b44d57d8d-wswwn\" (UID: \"f5e53f50-9601-4b94-8f11-92dce248be21\") " pod="opendatahub/maas-controller-b44d57d8d-wswwn"
Apr 22 21:15:56.676188 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.676162 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hfmf\" (UniqueName: \"kubernetes.io/projected/f5e53f50-9601-4b94-8f11-92dce248be21-kube-api-access-8hfmf\") pod \"maas-controller-b44d57d8d-wswwn\" (UID: \"f5e53f50-9601-4b94-8f11-92dce248be21\") " pod="opendatahub/maas-controller-b44d57d8d-wswwn"
Apr 22 21:15:56.727028 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.727003 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-hxxfk"]
Apr 22 21:15:56.729083 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:15:56.729052 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod218f3f80_1d77_40c6_904b_86c941da22a2.slice/crio-4bab4b4a7bcabd4364cdb431159db578b872519fad455c9ba6e5c7a26534fd7f WatchSource:0}: Error finding container 4bab4b4a7bcabd4364cdb431159db578b872519fad455c9ba6e5c7a26534fd7f: Status 404 returned error can't find the container with id 4bab4b4a7bcabd4364cdb431159db578b872519fad455c9ba6e5c7a26534fd7f
Apr 22 21:15:56.769528 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.769494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk8nq\" (UniqueName: \"kubernetes.io/projected/2d2cd98a-b04e-4126-8be0-8b870d9e8701-kube-api-access-vk8nq\") pod \"maas-controller-5c6497bbdb-cbw52\" (UID: \"2d2cd98a-b04e-4126-8be0-8b870d9e8701\") " pod="opendatahub/maas-controller-5c6497bbdb-cbw52"
Apr 22 21:15:56.804533 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.804500 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-b44d57d8d-wswwn"
Apr 22 21:15:56.849571 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.849535 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk" event={"ID":"218f3f80-1d77-40c6-904b-86c941da22a2","Type":"ContainerStarted","Data":"4bab4b4a7bcabd4364cdb431159db578b872519fad455c9ba6e5c7a26534fd7f"}
Apr 22 21:15:56.870146 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.870076 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk8nq\" (UniqueName: \"kubernetes.io/projected/2d2cd98a-b04e-4126-8be0-8b870d9e8701-kube-api-access-vk8nq\") pod \"maas-controller-5c6497bbdb-cbw52\" (UID: \"2d2cd98a-b04e-4126-8be0-8b870d9e8701\") " pod="opendatahub/maas-controller-5c6497bbdb-cbw52"
Apr 22 21:15:56.877824 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.877800 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk8nq\" (UniqueName: \"kubernetes.io/projected/2d2cd98a-b04e-4126-8be0-8b870d9e8701-kube-api-access-vk8nq\") pod \"maas-controller-5c6497bbdb-cbw52\" (UID: \"2d2cd98a-b04e-4126-8be0-8b870d9e8701\") " pod="opendatahub/maas-controller-5c6497bbdb-cbw52"
Apr 22 21:15:56.917019 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.916946 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-wswwn"]
Apr 22 21:15:56.919100 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:15:56.919060 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5e53f50_9601_4b94_8f11_92dce248be21.slice/crio-1042ecea174ea12cc9988f514a93445658221f0356f271c9aa00b520ec02845e WatchSource:0}: Error finding container 1042ecea174ea12cc9988f514a93445658221f0356f271c9aa00b520ec02845e: Status 404 returned error can't find the container with id 1042ecea174ea12cc9988f514a93445658221f0356f271c9aa00b520ec02845e
Apr 22 21:15:56.945195 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:56.945168 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c6497bbdb-cbw52"
Apr 22 21:15:57.064873 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:57.064850 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-cbw52"]
Apr 22 21:15:57.067330 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:15:57.067303 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d2cd98a_b04e_4126_8be0_8b870d9e8701.slice/crio-a931ded9f8739c90553c600f1648d8284b321343ccf53cec9385e574a0463abd WatchSource:0}: Error finding container a931ded9f8739c90553c600f1648d8284b321343ccf53cec9385e574a0463abd: Status 404 returned error can't find the container with id a931ded9f8739c90553c600f1648d8284b321343ccf53cec9385e574a0463abd
Apr 22 21:15:57.857800 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:57.857758 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b44d57d8d-wswwn" event={"ID":"f5e53f50-9601-4b94-8f11-92dce248be21","Type":"ContainerStarted","Data":"1042ecea174ea12cc9988f514a93445658221f0356f271c9aa00b520ec02845e"}
Apr 22 21:15:57.860011 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:15:57.859973 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c6497bbdb-cbw52" event={"ID":"2d2cd98a-b04e-4126-8be0-8b870d9e8701","Type":"ContainerStarted","Data":"a931ded9f8739c90553c600f1648d8284b321343ccf53cec9385e574a0463abd"}
Apr 22 21:16:00.873037 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.872999 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c6497bbdb-cbw52" event={"ID":"2d2cd98a-b04e-4126-8be0-8b870d9e8701","Type":"ContainerStarted","Data":"93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106"}
Apr 22 21:16:00.873499 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.873205 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5c6497bbdb-cbw52"
Apr 22 21:16:00.874419 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.874392 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk" event={"ID":"218f3f80-1d77-40c6-904b-86c941da22a2","Type":"ContainerStarted","Data":"2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1"}
Apr 22 21:16:00.874573 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.874480 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk"
Apr 22 21:16:00.874573 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.874461 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk" podUID="218f3f80-1d77-40c6-904b-86c941da22a2" containerName="manager" containerID="cri-o://2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1" gracePeriod=10
Apr 22 21:16:00.875699 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.875677 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b44d57d8d-wswwn" event={"ID":"f5e53f50-9601-4b94-8f11-92dce248be21","Type":"ContainerStarted","Data":"2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7"}
Apr 22 21:16:00.875811 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.875800 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-b44d57d8d-wswwn"
Apr 22 21:16:00.891025 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.890985 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5c6497bbdb-cbw52" podStartSLOduration=1.6482180629999998 podStartE2EDuration="4.890972737s" podCreationTimestamp="2026-04-22 21:15:56 +0000 UTC" firstStartedPulling="2026-04-22 21:15:57.068599617 +0000 UTC m=+435.177410507" lastFinishedPulling="2026-04-22 21:16:00.311354273 +0000 UTC m=+438.420165181" observedRunningTime="2026-04-22 21:16:00.889960047 +0000 UTC m=+438.998770961" watchObservedRunningTime="2026-04-22 21:16:00.890972737 +0000 UTC m=+438.999783650"
Apr 22 21:16:00.907351 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.907308 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-b44d57d8d-wswwn" podStartSLOduration=1.527018489 podStartE2EDuration="4.907293184s" podCreationTimestamp="2026-04-22 21:15:56 +0000 UTC" firstStartedPulling="2026-04-22 21:15:56.920416291 +0000 UTC m=+435.029227183" lastFinishedPulling="2026-04-22 21:16:00.300690988 +0000 UTC m=+438.409501878" observedRunningTime="2026-04-22 21:16:00.906046671 +0000 UTC m=+439.014857608" watchObservedRunningTime="2026-04-22 21:16:00.907293184 +0000 UTC m=+439.016104096"
Apr 22 21:16:00.922293 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:00.922220 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk" podStartSLOduration=1.351889539 podStartE2EDuration="4.922205544s" podCreationTimestamp="2026-04-22 21:15:56 +0000 UTC" firstStartedPulling="2026-04-22 21:15:56.730264744 +0000 UTC m=+434.839075635" lastFinishedPulling="2026-04-22 21:16:00.300580733 +0000 UTC m=+438.409391640" observedRunningTime="2026-04-22 21:16:00.921132169 +0000 UTC m=+439.029943076" watchObservedRunningTime="2026-04-22 21:16:00.922205544 +0000 UTC m=+439.031016475"
Apr 22 21:16:01.108678 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.108657 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk"
Apr 22 21:16:01.208344 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.208314 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjsdc\" (UniqueName: \"kubernetes.io/projected/218f3f80-1d77-40c6-904b-86c941da22a2-kube-api-access-cjsdc\") pod \"218f3f80-1d77-40c6-904b-86c941da22a2\" (UID: \"218f3f80-1d77-40c6-904b-86c941da22a2\") "
Apr 22 21:16:01.210471 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.210446 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218f3f80-1d77-40c6-904b-86c941da22a2-kube-api-access-cjsdc" (OuterVolumeSpecName: "kube-api-access-cjsdc") pod "218f3f80-1d77-40c6-904b-86c941da22a2" (UID: "218f3f80-1d77-40c6-904b-86c941da22a2"). InnerVolumeSpecName "kube-api-access-cjsdc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:16:01.309432 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.309392 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cjsdc\" (UniqueName: \"kubernetes.io/projected/218f3f80-1d77-40c6-904b-86c941da22a2-kube-api-access-cjsdc\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\""
Apr 22 21:16:01.880099 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.880012 2570 generic.go:358] "Generic (PLEG): container finished" podID="218f3f80-1d77-40c6-904b-86c941da22a2" containerID="2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1" exitCode=0
Apr 22 21:16:01.880099 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.880086 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk"
Apr 22 21:16:01.880615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.880110 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk" event={"ID":"218f3f80-1d77-40c6-904b-86c941da22a2","Type":"ContainerDied","Data":"2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1"}
Apr 22 21:16:01.880615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.880159 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-hxxfk" event={"ID":"218f3f80-1d77-40c6-904b-86c941da22a2","Type":"ContainerDied","Data":"4bab4b4a7bcabd4364cdb431159db578b872519fad455c9ba6e5c7a26534fd7f"}
Apr 22 21:16:01.880615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.880182 2570 scope.go:117] "RemoveContainer" containerID="2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1"
Apr 22 21:16:01.888560 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.888487 2570 scope.go:117] "RemoveContainer" containerID="2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1"
Apr 22 21:16:01.888777 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:16:01.888751 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1\": container with ID starting with 2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1 not found: ID does not exist" containerID="2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1"
Apr 22 21:16:01.888853 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.888789 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1"} err="failed to get container status \"2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1\": rpc error: code = NotFound desc = could not find container \"2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1\": container with ID starting with 2a6c0623481c892f82fb53dbb21e3215ceaabb51cf12c0b000ff669b5a4859e1 not found: ID does not exist"
Apr 22 21:16:01.902605 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.902579 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-hxxfk"]
Apr 22 21:16:01.906101 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:01.906076 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-hxxfk"]
Apr 22 21:16:02.490847 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.490808 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218f3f80-1d77-40c6-904b-86c941da22a2" path="/var/lib/kubelet/pods/218f3f80-1d77-40c6-904b-86c941da22a2/volumes"
Apr 22 21:16:02.529672 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.529635 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-65dcbbcf47-mw2ps"]
Apr 22 21:16:02.529995 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.529978 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="218f3f80-1d77-40c6-904b-86c941da22a2" containerName="manager"
Apr 22 21:16:02.529995 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.529995 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="218f3f80-1d77-40c6-904b-86c941da22a2" containerName="manager"
Apr 22 21:16:02.530158 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.530072 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="218f3f80-1d77-40c6-904b-86c941da22a2" containerName="manager"
Apr 22 21:16:02.533012 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.532991 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:02.535242 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.535221 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 22 21:16:02.535366 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.535222 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 22 21:16:02.535763 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.535737 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qxg7n\""
Apr 22 21:16:02.541812 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.541792 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-65dcbbcf47-mw2ps"]
Apr 22 21:16:02.619057 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.619018 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/53767960-6700-4ee0-98e0-452669ba38ef-maas-api-tls\") pod \"maas-api-65dcbbcf47-mw2ps\" (UID: \"53767960-6700-4ee0-98e0-452669ba38ef\") " pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:02.619223 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.619066 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mjk\" (UniqueName: \"kubernetes.io/projected/53767960-6700-4ee0-98e0-452669ba38ef-kube-api-access-59mjk\") pod \"maas-api-65dcbbcf47-mw2ps\" (UID: \"53767960-6700-4ee0-98e0-452669ba38ef\") " pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:02.720374 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.720335 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/53767960-6700-4ee0-98e0-452669ba38ef-maas-api-tls\") pod \"maas-api-65dcbbcf47-mw2ps\" (UID: \"53767960-6700-4ee0-98e0-452669ba38ef\") " pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:02.720557 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.720391 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59mjk\" (UniqueName: \"kubernetes.io/projected/53767960-6700-4ee0-98e0-452669ba38ef-kube-api-access-59mjk\") pod \"maas-api-65dcbbcf47-mw2ps\" (UID: \"53767960-6700-4ee0-98e0-452669ba38ef\") " pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:02.720557 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:16:02.720490 2570 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found
Apr 22 21:16:02.720646 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:16:02.720576 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53767960-6700-4ee0-98e0-452669ba38ef-maas-api-tls podName:53767960-6700-4ee0-98e0-452669ba38ef nodeName:}" failed. No retries permitted until 2026-04-22 21:16:03.220553874 +0000 UTC m=+441.329364783 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/53767960-6700-4ee0-98e0-452669ba38ef-maas-api-tls") pod "maas-api-65dcbbcf47-mw2ps" (UID: "53767960-6700-4ee0-98e0-452669ba38ef") : secret "maas-api-serving-cert" not found
Apr 22 21:16:02.731497 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:02.731465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mjk\" (UniqueName: \"kubernetes.io/projected/53767960-6700-4ee0-98e0-452669ba38ef-kube-api-access-59mjk\") pod \"maas-api-65dcbbcf47-mw2ps\" (UID: \"53767960-6700-4ee0-98e0-452669ba38ef\") " pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:03.224775 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:03.224742 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/53767960-6700-4ee0-98e0-452669ba38ef-maas-api-tls\") pod \"maas-api-65dcbbcf47-mw2ps\" (UID: \"53767960-6700-4ee0-98e0-452669ba38ef\") " pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:03.227051 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:03.227031 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/53767960-6700-4ee0-98e0-452669ba38ef-maas-api-tls\") pod \"maas-api-65dcbbcf47-mw2ps\" (UID: \"53767960-6700-4ee0-98e0-452669ba38ef\") " pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:03.267675 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:03.267640 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"]
Apr 22 21:16:03.447566 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:03.447530 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:03.565574 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:03.565544 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-65dcbbcf47-mw2ps"]
Apr 22 21:16:03.569063 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:16:03.569031 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53767960_6700_4ee0_98e0_452669ba38ef.slice/crio-241f3681806b28b80b36b3cee72ec56ffd0e201d03613146d587aad80905113a WatchSource:0}: Error finding container 241f3681806b28b80b36b3cee72ec56ffd0e201d03613146d587aad80905113a: Status 404 returned error can't find the container with id 241f3681806b28b80b36b3cee72ec56ffd0e201d03613146d587aad80905113a
Apr 22 21:16:03.888690 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:03.888599 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65dcbbcf47-mw2ps" event={"ID":"53767960-6700-4ee0-98e0-452669ba38ef","Type":"ContainerStarted","Data":"241f3681806b28b80b36b3cee72ec56ffd0e201d03613146d587aad80905113a"}
Apr 22 21:16:05.899948 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:05.899898 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65dcbbcf47-mw2ps" event={"ID":"53767960-6700-4ee0-98e0-452669ba38ef","Type":"ContainerStarted","Data":"ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea"}
Apr 22 21:16:05.900330 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:05.900073 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:05.915683 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:05.915633 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-65dcbbcf47-mw2ps" podStartSLOduration=2.502218296 podStartE2EDuration="3.915612614s" podCreationTimestamp="2026-04-22 21:16:02 +0000 UTC" firstStartedPulling="2026-04-22 21:16:03.570608439 +0000 UTC m=+441.679419330" lastFinishedPulling="2026-04-22 21:16:04.984002758 +0000 UTC m=+443.092813648" observedRunningTime="2026-04-22 21:16:05.914572962 +0000 UTC m=+444.023383874" watchObservedRunningTime="2026-04-22 21:16:05.915612614 +0000 UTC m=+444.024423528"
Apr 22 21:16:11.884909 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:11.884876 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-b44d57d8d-wswwn"
Apr 22 21:16:11.885303 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:11.884938 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5c6497bbdb-cbw52"
Apr 22 21:16:11.907947 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:11.907917 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-65dcbbcf47-mw2ps"
Apr 22 21:16:11.939702 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:11.939669 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-wswwn"]
Apr 22 21:16:11.939931 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:11.939906 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-b44d57d8d-wswwn" podUID="f5e53f50-9601-4b94-8f11-92dce248be21" containerName="manager" containerID="cri-o://2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7" gracePeriod=10
Apr 22 21:16:12.202787 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.202763 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-b44d57d8d-wswwn"
Apr 22 21:16:12.237852 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.237816 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-db6f5bcdb-lm526"]
Apr 22 21:16:12.238375 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.238355 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5e53f50-9601-4b94-8f11-92dce248be21" containerName="manager"
Apr 22 21:16:12.238447 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.238380 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5e53f50-9601-4b94-8f11-92dce248be21" containerName="manager"
Apr 22 21:16:12.238485 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.238465 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5e53f50-9601-4b94-8f11-92dce248be21" containerName="manager"
Apr 22 21:16:12.242125 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.242107 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-db6f5bcdb-lm526"
Apr 22 21:16:12.249904 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.249878 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-db6f5bcdb-lm526"]
Apr 22 21:16:12.310825 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.310792 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hfmf\" (UniqueName: \"kubernetes.io/projected/f5e53f50-9601-4b94-8f11-92dce248be21-kube-api-access-8hfmf\") pod \"f5e53f50-9601-4b94-8f11-92dce248be21\" (UID: \"f5e53f50-9601-4b94-8f11-92dce248be21\") "
Apr 22 21:16:12.312897 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.312867 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5e53f50-9601-4b94-8f11-92dce248be21-kube-api-access-8hfmf" (OuterVolumeSpecName: "kube-api-access-8hfmf") pod "f5e53f50-9601-4b94-8f11-92dce248be21" (UID: "f5e53f50-9601-4b94-8f11-92dce248be21"). InnerVolumeSpecName "kube-api-access-8hfmf".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:16:12.412062 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.412024 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84pbt\" (UniqueName: \"kubernetes.io/projected/79e26af0-a55d-4e55-9bc8-26b7efbfc8ae-kube-api-access-84pbt\") pod \"maas-controller-db6f5bcdb-lm526\" (UID: \"79e26af0-a55d-4e55-9bc8-26b7efbfc8ae\") " pod="opendatahub/maas-controller-db6f5bcdb-lm526" Apr 22 21:16:12.412223 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.412099 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8hfmf\" (UniqueName: \"kubernetes.io/projected/f5e53f50-9601-4b94-8f11-92dce248be21-kube-api-access-8hfmf\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:16:12.513284 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.513236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84pbt\" (UniqueName: \"kubernetes.io/projected/79e26af0-a55d-4e55-9bc8-26b7efbfc8ae-kube-api-access-84pbt\") pod \"maas-controller-db6f5bcdb-lm526\" (UID: \"79e26af0-a55d-4e55-9bc8-26b7efbfc8ae\") " pod="opendatahub/maas-controller-db6f5bcdb-lm526" Apr 22 21:16:12.521068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.521042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84pbt\" (UniqueName: \"kubernetes.io/projected/79e26af0-a55d-4e55-9bc8-26b7efbfc8ae-kube-api-access-84pbt\") pod \"maas-controller-db6f5bcdb-lm526\" (UID: \"79e26af0-a55d-4e55-9bc8-26b7efbfc8ae\") " pod="opendatahub/maas-controller-db6f5bcdb-lm526" Apr 22 21:16:12.554049 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.554021 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-db6f5bcdb-lm526" Apr 22 21:16:12.672620 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.672596 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-db6f5bcdb-lm526"] Apr 22 21:16:12.674992 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:16:12.674967 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e26af0_a55d_4e55_9bc8_26b7efbfc8ae.slice/crio-6925afb47d65a68a8d74af72ed27a202f6e7a63a0dbd9e9ee3a28fe98e699b38 WatchSource:0}: Error finding container 6925afb47d65a68a8d74af72ed27a202f6e7a63a0dbd9e9ee3a28fe98e699b38: Status 404 returned error can't find the container with id 6925afb47d65a68a8d74af72ed27a202f6e7a63a0dbd9e9ee3a28fe98e699b38 Apr 22 21:16:12.924351 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.924268 2570 generic.go:358] "Generic (PLEG): container finished" podID="f5e53f50-9601-4b94-8f11-92dce248be21" containerID="2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7" exitCode=0 Apr 22 21:16:12.924351 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.924329 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-b44d57d8d-wswwn" Apr 22 21:16:12.924351 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.924340 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b44d57d8d-wswwn" event={"ID":"f5e53f50-9601-4b94-8f11-92dce248be21","Type":"ContainerDied","Data":"2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7"} Apr 22 21:16:12.924855 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.924379 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b44d57d8d-wswwn" event={"ID":"f5e53f50-9601-4b94-8f11-92dce248be21","Type":"ContainerDied","Data":"1042ecea174ea12cc9988f514a93445658221f0356f271c9aa00b520ec02845e"} Apr 22 21:16:12.924855 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.924401 2570 scope.go:117] "RemoveContainer" containerID="2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7" Apr 22 21:16:12.925536 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.925507 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db6f5bcdb-lm526" event={"ID":"79e26af0-a55d-4e55-9bc8-26b7efbfc8ae","Type":"ContainerStarted","Data":"6925afb47d65a68a8d74af72ed27a202f6e7a63a0dbd9e9ee3a28fe98e699b38"} Apr 22 21:16:12.932175 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.932136 2570 scope.go:117] "RemoveContainer" containerID="2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7" Apr 22 21:16:12.932424 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:16:12.932407 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7\": container with ID starting with 2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7 not found: ID does not exist" containerID="2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7" Apr 22 21:16:12.932494 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.932436 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7"} err="failed to get container status \"2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7\": rpc error: code = NotFound desc = could not find container \"2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7\": container with ID starting with 2041fcdb58c59e04233de4ac16febc72b5a500b84d58a530234ba26542b565b7 not found: ID does not exist" Apr 22 21:16:12.938826 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.938803 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-wswwn"] Apr 22 21:16:12.942073 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:12.942044 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-wswwn"] Apr 22 21:16:13.930418 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:13.930382 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db6f5bcdb-lm526" event={"ID":"79e26af0-a55d-4e55-9bc8-26b7efbfc8ae","Type":"ContainerStarted","Data":"469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1"} Apr 22 21:16:13.930862 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:13.930439 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-db6f5bcdb-lm526" Apr 22 21:16:13.946575 ip-10-0-130-19 kubenswrapper[2570]: 
I0422 21:16:13.946526 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-db6f5bcdb-lm526" podStartSLOduration=1.559196702 podStartE2EDuration="1.946510869s" podCreationTimestamp="2026-04-22 21:16:12 +0000 UTC" firstStartedPulling="2026-04-22 21:16:12.676152296 +0000 UTC m=+450.784963186" lastFinishedPulling="2026-04-22 21:16:13.063466457 +0000 UTC m=+451.172277353" observedRunningTime="2026-04-22 21:16:13.944236617 +0000 UTC m=+452.053047531" watchObservedRunningTime="2026-04-22 21:16:13.946510869 +0000 UTC m=+452.055321781" Apr 22 21:16:14.489161 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:14.489122 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5e53f50-9601-4b94-8f11-92dce248be21" path="/var/lib/kubelet/pods/f5e53f50-9601-4b94-8f11-92dce248be21/volumes" Apr 22 21:16:24.939896 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:24.939866 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-db6f5bcdb-lm526" Apr 22 21:16:24.974643 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:24.974611 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-cbw52"] Apr 22 21:16:24.974875 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:24.974838 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5c6497bbdb-cbw52" podUID="2d2cd98a-b04e-4126-8be0-8b870d9e8701" containerName="manager" containerID="cri-o://93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106" gracePeriod=10 Apr 22 21:16:25.236713 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.236689 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c6497bbdb-cbw52" Apr 22 21:16:25.316853 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.316813 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk8nq\" (UniqueName: \"kubernetes.io/projected/2d2cd98a-b04e-4126-8be0-8b870d9e8701-kube-api-access-vk8nq\") pod \"2d2cd98a-b04e-4126-8be0-8b870d9e8701\" (UID: \"2d2cd98a-b04e-4126-8be0-8b870d9e8701\") " Apr 22 21:16:25.318889 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.318859 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2cd98a-b04e-4126-8be0-8b870d9e8701-kube-api-access-vk8nq" (OuterVolumeSpecName: "kube-api-access-vk8nq") pod "2d2cd98a-b04e-4126-8be0-8b870d9e8701" (UID: "2d2cd98a-b04e-4126-8be0-8b870d9e8701"). InnerVolumeSpecName "kube-api-access-vk8nq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:16:25.417757 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.417725 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vk8nq\" (UniqueName: \"kubernetes.io/projected/2d2cd98a-b04e-4126-8be0-8b870d9e8701-kube-api-access-vk8nq\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:16:25.968718 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.968683 2570 generic.go:358] "Generic (PLEG): container finished" podID="2d2cd98a-b04e-4126-8be0-8b870d9e8701" containerID="93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106" exitCode=0 Apr 22 21:16:25.969121 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.968747 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5c6497bbdb-cbw52" Apr 22 21:16:25.969121 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.968763 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c6497bbdb-cbw52" event={"ID":"2d2cd98a-b04e-4126-8be0-8b870d9e8701","Type":"ContainerDied","Data":"93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106"} Apr 22 21:16:25.969121 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.968795 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c6497bbdb-cbw52" event={"ID":"2d2cd98a-b04e-4126-8be0-8b870d9e8701","Type":"ContainerDied","Data":"a931ded9f8739c90553c600f1648d8284b321343ccf53cec9385e574a0463abd"} Apr 22 21:16:25.969121 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.968810 2570 scope.go:117] "RemoveContainer" containerID="93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106" Apr 22 21:16:25.977374 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.977207 2570 scope.go:117] "RemoveContainer" containerID="93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106" Apr 22 21:16:25.977509 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:16:25.977491 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106\": container with ID starting with 93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106 not found: ID does not exist" containerID="93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106" Apr 22 21:16:25.977569 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.977524 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106"} err="failed to get container status \"93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106\": rpc error: code = NotFound desc = could not find container \"93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106\": container with ID starting with 93731325ca8a272d54fd21f34a5e81c992ec9f823479286ba7553569d0379106 not found: ID does not exist" Apr 22 21:16:25.992305 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.992280 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-cbw52"] Apr 22 21:16:25.994283 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:25.994263 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-cbw52"] Apr 22 21:16:26.489188 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:26.489156 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2cd98a-b04e-4126-8be0-8b870d9e8701" path="/var/lib/kubelet/pods/2d2cd98a-b04e-4126-8be0-8b870d9e8701/volumes" Apr 22 21:16:50.047990 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.047958 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7"] Apr 22 21:16:50.048407 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.048270 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d2cd98a-b04e-4126-8be0-8b870d9e8701" containerName="manager" Apr 22 21:16:50.048407 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.048282 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2cd98a-b04e-4126-8be0-8b870d9e8701" containerName="manager" Apr 22 21:16:50.048407 ip-10-0-130-19 
kubenswrapper[2570]: I0422 21:16:50.048344 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d2cd98a-b04e-4126-8be0-8b870d9e8701" containerName="manager" Apr 22 21:16:50.053476 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.053459 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.055823 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.055799 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 21:16:50.056864 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.056844 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-bhkfk\"" Apr 22 21:16:50.056983 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.056883 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 21:16:50.056983 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.056847 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 22 21:16:50.058632 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.058608 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7"] Apr 22 21:16:50.115002 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.114966 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.115002 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.115003 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.115211 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.115026 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.115211 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.115169 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnxk2\" (UniqueName: \"kubernetes.io/projected/e4708691-3472-4f12-96a3-fceba7d30f92-kube-api-access-vnxk2\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.115321 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.115232 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-dshm\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.115365 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.115328 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4708691-3472-4f12-96a3-fceba7d30f92-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.216603 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.216557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4708691-3472-4f12-96a3-fceba7d30f92-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.216603 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.216609 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.216804 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.216663 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.216804 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.216713 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.216922 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.216897 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnxk2\" (UniqueName: \"kubernetes.io/projected/e4708691-3472-4f12-96a3-fceba7d30f92-kube-api-access-vnxk2\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.216989 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.216972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.217079 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.217039 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: 
\"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.217192 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.217084 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.217192 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.217125 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.219068 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.219040 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e4708691-3472-4f12-96a3-fceba7d30f92-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.219145 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.219088 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4708691-3472-4f12-96a3-fceba7d30f92-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.224815 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.224789 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnxk2\" (UniqueName: \"kubernetes.io/projected/e4708691-3472-4f12-96a3-fceba7d30f92-kube-api-access-vnxk2\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-xt9k7\" (UID: \"e4708691-3472-4f12-96a3-fceba7d30f92\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.365977 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.365892 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:16:50.487294 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:16:50.487237 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4708691_3472_4f12_96a3_fceba7d30f92.slice/crio-489b641e5ddd6a1c902264eadb9ba3243f9153b6e10d5dad46b05857c4bcd4a5 WatchSource:0}: Error finding container 489b641e5ddd6a1c902264eadb9ba3243f9153b6e10d5dad46b05857c4bcd4a5: Status 404 returned error can't find the container with id 489b641e5ddd6a1c902264eadb9ba3243f9153b6e10d5dad46b05857c4bcd4a5 Apr 22 21:16:50.489229 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.489206 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7"] Apr 22 21:16:50.545096 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:50.545060 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:16:51.048650 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:51.048613 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" event={"ID":"e4708691-3472-4f12-96a3-fceba7d30f92","Type":"ContainerStarted","Data":"489b641e5ddd6a1c902264eadb9ba3243f9153b6e10d5dad46b05857c4bcd4a5"} Apr 22 21:16:53.535603 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:53.535570 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-65dcbbcf47-mw2ps"] Apr 22 21:16:53.536236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:53.536168 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-65dcbbcf47-mw2ps" podUID="53767960-6700-4ee0-98e0-452669ba38ef" containerName="maas-api" containerID="cri-o://ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea" gracePeriod=30 Apr 22 21:16:53.795131 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:53.795059 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-65dcbbcf47-mw2ps" Apr 22 21:16:53.847409 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:53.847372 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/53767960-6700-4ee0-98e0-452669ba38ef-maas-api-tls\") pod \"53767960-6700-4ee0-98e0-452669ba38ef\" (UID: \"53767960-6700-4ee0-98e0-452669ba38ef\") " Apr 22 21:16:53.847645 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:53.847459 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59mjk\" (UniqueName: \"kubernetes.io/projected/53767960-6700-4ee0-98e0-452669ba38ef-kube-api-access-59mjk\") pod \"53767960-6700-4ee0-98e0-452669ba38ef\" (UID: \"53767960-6700-4ee0-98e0-452669ba38ef\") " Apr 22 21:16:53.850126 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:53.850090 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53767960-6700-4ee0-98e0-452669ba38ef-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "53767960-6700-4ee0-98e0-452669ba38ef" (UID: "53767960-6700-4ee0-98e0-452669ba38ef"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:16:53.850279 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:53.850203 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53767960-6700-4ee0-98e0-452669ba38ef-kube-api-access-59mjk" (OuterVolumeSpecName: "kube-api-access-59mjk") pod "53767960-6700-4ee0-98e0-452669ba38ef" (UID: "53767960-6700-4ee0-98e0-452669ba38ef"). InnerVolumeSpecName "kube-api-access-59mjk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:16:53.948537 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:53.948499 2570 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/53767960-6700-4ee0-98e0-452669ba38ef-maas-api-tls\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:16:53.948537 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:53.948533 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-59mjk\" (UniqueName: \"kubernetes.io/projected/53767960-6700-4ee0-98e0-452669ba38ef-kube-api-access-59mjk\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:16:54.060895 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.060802 2570 generic.go:358] "Generic (PLEG): container finished" podID="53767960-6700-4ee0-98e0-452669ba38ef" containerID="ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea" exitCode=0 Apr 22 21:16:54.060895 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.060869 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-65dcbbcf47-mw2ps" Apr 22 21:16:54.061104 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.060892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65dcbbcf47-mw2ps" event={"ID":"53767960-6700-4ee0-98e0-452669ba38ef","Type":"ContainerDied","Data":"ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea"} Apr 22 21:16:54.061104 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.060940 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65dcbbcf47-mw2ps" event={"ID":"53767960-6700-4ee0-98e0-452669ba38ef","Type":"ContainerDied","Data":"241f3681806b28b80b36b3cee72ec56ffd0e201d03613146d587aad80905113a"} Apr 22 21:16:54.061104 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.060964 2570 scope.go:117] "RemoveContainer" containerID="ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea" Apr 22 21:16:54.070352 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.070326 2570 scope.go:117] "RemoveContainer" containerID="ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea" Apr 22 21:16:54.070664 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:16:54.070636 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea\": container with ID starting with ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea not found: ID does not exist" containerID="ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea" Apr 22 21:16:54.070769 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.070675 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea"} err="failed to get container status \"ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea\": rpc error: code = 
NotFound desc = could not find container \"ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea\": container with ID starting with ce333a6e11e95d94b454bedcc540105a0fa98c6d7ac14758ba7dec1888247bea not found: ID does not exist" Apr 22 21:16:54.083800 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.083778 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-65dcbbcf47-mw2ps"] Apr 22 21:16:54.087023 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.087000 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-65dcbbcf47-mw2ps"] Apr 22 21:16:54.490370 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:54.490339 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53767960-6700-4ee0-98e0-452669ba38ef" path="/var/lib/kubelet/pods/53767960-6700-4ee0-98e0-452669ba38ef/volumes" Apr 22 21:16:56.545655 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:56.545622 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:16:58.077219 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:16:58.077184 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" event={"ID":"e4708691-3472-4f12-96a3-fceba7d30f92","Type":"ContainerStarted","Data":"677b1622559c8bbe12f2f65b8396307078123cf3895226bf50d426f38d691359"} Apr 22 21:17:03.096110 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:03.096021 2570 generic.go:358] "Generic (PLEG): container finished" podID="e4708691-3472-4f12-96a3-fceba7d30f92" containerID="677b1622559c8bbe12f2f65b8396307078123cf3895226bf50d426f38d691359" exitCode=0 Apr 22 21:17:03.096110 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:03.096095 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" event={"ID":"e4708691-3472-4f12-96a3-fceba7d30f92","Type":"ContainerDied","Data":"677b1622559c8bbe12f2f65b8396307078123cf3895226bf50d426f38d691359"} Apr 22 21:17:05.104011 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:05.103978 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" event={"ID":"e4708691-3472-4f12-96a3-fceba7d30f92","Type":"ContainerStarted","Data":"87feff18801799cb18daa2a906653c25f1ac6fac75aba9a2a8de684345c821b6"} Apr 22 21:17:05.104413 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:05.104186 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:17:05.127441 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:05.127395 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" podStartSLOduration=1.394641082 podStartE2EDuration="15.127381079s" podCreationTimestamp="2026-04-22 21:16:50 +0000 UTC" firstStartedPulling="2026-04-22 21:16:50.489078572 +0000 UTC m=+488.597889463" lastFinishedPulling="2026-04-22 21:17:04.221818569 +0000 UTC m=+502.330629460" observedRunningTime="2026-04-22 21:17:05.126435337 +0000 UTC m=+503.235246253" watchObservedRunningTime="2026-04-22 21:17:05.127381079 +0000 UTC m=+503.236191992" Apr 22 21:17:09.226176 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.226138 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr"] Apr 22 21:17:09.226565 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.226442 2570 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53767960-6700-4ee0-98e0-452669ba38ef" containerName="maas-api" Apr 22 21:17:09.226565 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.226453 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="53767960-6700-4ee0-98e0-452669ba38ef" containerName="maas-api" Apr 22 21:17:09.226565 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.226512 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="53767960-6700-4ee0-98e0-452669ba38ef" containerName="maas-api" Apr 22 21:17:09.230545 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.230525 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.232754 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.232731 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 22 21:17:09.239223 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.239201 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr"] Apr 22 21:17:09.381959 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.381927 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0cd7b77-8550-4fa3-be6b-8b01de03626c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.382137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.381968 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.382137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.382033 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7ldn\" (UniqueName: \"kubernetes.io/projected/e0cd7b77-8550-4fa3-be6b-8b01de03626c-kube-api-access-k7ldn\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.382137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.382065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.382137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.382088 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: 
\"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.382137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.382122 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.483328 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.483229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.483328 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.483301 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0cd7b77-8550-4fa3-be6b-8b01de03626c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.483553 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.483328 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.483553 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.483374 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7ldn\" (UniqueName: \"kubernetes.io/projected/e0cd7b77-8550-4fa3-be6b-8b01de03626c-kube-api-access-k7ldn\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.483553 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.483417 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.483553 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.483441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.483820 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.483677 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.483820 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.483787 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.483919 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.483851 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.485692 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.485673 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e0cd7b77-8550-4fa3-be6b-8b01de03626c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.485828 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.485810 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e0cd7b77-8550-4fa3-be6b-8b01de03626c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.490632 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.490606 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7ldn\" (UniqueName: \"kubernetes.io/projected/e0cd7b77-8550-4fa3-be6b-8b01de03626c-kube-api-access-k7ldn\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr\" (UID: \"e0cd7b77-8550-4fa3-be6b-8b01de03626c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.540619 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.540583 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:09.665734 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:09.665703 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr"] Apr 22 21:17:09.667279 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:17:09.667225 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0cd7b77_8550_4fa3_be6b_8b01de03626c.slice/crio-90efedcb508f32502e39edf8e558e12f100185cdbfb83a4ad5dfe98ac99218ad WatchSource:0}: Error finding container 90efedcb508f32502e39edf8e558e12f100185cdbfb83a4ad5dfe98ac99218ad: Status 404 returned error can't find the container with id 90efedcb508f32502e39edf8e558e12f100185cdbfb83a4ad5dfe98ac99218ad Apr 22 21:17:10.121787 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:10.121752 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" event={"ID":"e0cd7b77-8550-4fa3-be6b-8b01de03626c","Type":"ContainerStarted","Data":"622cda3edb8afd65119390d744665375c0a0f9f73d34146f0e7a197dd9d501cd"} Apr 22 21:17:10.121787 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:10.121788 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" event={"ID":"e0cd7b77-8550-4fa3-be6b-8b01de03626c","Type":"ContainerStarted","Data":"90efedcb508f32502e39edf8e558e12f100185cdbfb83a4ad5dfe98ac99218ad"} Apr 22 21:17:10.142078 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:10.142046 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:17:15.141319 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:15.141286 2570 generic.go:358] "Generic (PLEG): container finished" podID="e0cd7b77-8550-4fa3-be6b-8b01de03626c" containerID="622cda3edb8afd65119390d744665375c0a0f9f73d34146f0e7a197dd9d501cd" exitCode=0 Apr 22 21:17:15.141832 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:15.141360 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" event={"ID":"e0cd7b77-8550-4fa3-be6b-8b01de03626c","Type":"ContainerDied","Data":"622cda3edb8afd65119390d744665375c0a0f9f73d34146f0e7a197dd9d501cd"} Apr 22 21:17:16.120363 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:16.120331 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-xt9k7" Apr 22 21:17:16.146709 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:16.146674 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" event={"ID":"e0cd7b77-8550-4fa3-be6b-8b01de03626c","Type":"ContainerStarted","Data":"34f30c9caaa18b24099bbbd6b0e5f1e66caeaf6d19789e2e8acf85bef2a2d951"} Apr 22 21:17:16.147266 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:16.147228 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:16.164370 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:16.164327 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" podStartSLOduration=6.940170375 podStartE2EDuration="7.164312941s" 
podCreationTimestamp="2026-04-22 21:17:09 +0000 UTC" firstStartedPulling="2026-04-22 21:17:15.142054013 +0000 UTC m=+513.250864904" lastFinishedPulling="2026-04-22 21:17:15.366196575 +0000 UTC m=+513.475007470" observedRunningTime="2026-04-22 21:17:16.162759139 +0000 UTC m=+514.271570052" watchObservedRunningTime="2026-04-22 21:17:16.164312941 +0000 UTC m=+514.273123854" Apr 22 21:17:18.030137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.030105 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52"] Apr 22 21:17:18.056476 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.056397 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52"] Apr 22 21:17:18.056713 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.056693 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.059233 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.059194 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 22 21:17:18.157734 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.157703 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.157734 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.157736 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.157954 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.157757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4bb\" (UniqueName: \"kubernetes.io/projected/00ac352a-d689-494c-95de-0261a2575027-kube-api-access-bn4bb\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.157954 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.157822 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.157954 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.157892 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.157954 ip-10-0-130-19 kubenswrapper[2570]: 
I0422 21:17:18.157920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00ac352a-d689-494c-95de-0261a2575027-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.258597 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.258557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.258597 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.258595 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00ac352a-d689-494c-95de-0261a2575027-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.258863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.258622 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.258863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.258643 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.258863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.258662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4bb\" (UniqueName: \"kubernetes.io/projected/00ac352a-d689-494c-95de-0261a2575027-kube-api-access-bn4bb\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.258863 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.258691 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.259088 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.259062 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.259137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.259102 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.259137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.259075 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.261025 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.260999 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00ac352a-d689-494c-95de-0261a2575027-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.261244 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.261226 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00ac352a-d689-494c-95de-0261a2575027-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.266187 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.266161 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn4bb\" (UniqueName: \"kubernetes.io/projected/00ac352a-d689-494c-95de-0261a2575027-kube-api-access-bn4bb\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52\" (UID: \"00ac352a-d689-494c-95de-0261a2575027\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.367516 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.367421 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:18.489960 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:17:18.489924 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00ac352a_d689_494c_95de_0261a2575027.slice/crio-2e6b99f8023ec7d9cc60c54cea530a510404641320a5abfaa24574ddd362f4cf WatchSource:0}: Error finding container 2e6b99f8023ec7d9cc60c54cea530a510404641320a5abfaa24574ddd362f4cf: Status 404 returned error can't find the container with id 2e6b99f8023ec7d9cc60c54cea530a510404641320a5abfaa24574ddd362f4cf Apr 22 21:17:18.491506 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:18.491481 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52"] Apr 22 21:17:19.138523 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:19.138486 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:17:19.159433 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:19.159394 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" event={"ID":"00ac352a-d689-494c-95de-0261a2575027","Type":"ContainerStarted","Data":"31a46724001a198259b962e6d8264c4942ebd8cf49f19884edd4f2804c00ab7d"} Apr 22 21:17:19.159433 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:19.159434 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" event={"ID":"00ac352a-d689-494c-95de-0261a2575027","Type":"ContainerStarted","Data":"2e6b99f8023ec7d9cc60c54cea530a510404641320a5abfaa24574ddd362f4cf"} Apr 22 21:17:20.142677 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.142640 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c"] Apr 22 21:17:20.147088 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.147067 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.149225 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.149205 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 22 21:17:20.156659 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.156630 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c"] Apr 22 21:17:20.276679 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.276641 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.276846 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.276739 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.276846 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.276768 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.276961 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.276942 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.277002 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.276990 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.277048 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.277020 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv8c5\" (UniqueName: \"kubernetes.io/projected/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-kube-api-access-bv8c5\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.378579 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.378540 2570 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.378773 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.378612 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.378773 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.378646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bv8c5\" (UniqueName: \"kubernetes.io/projected/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-kube-api-access-bv8c5\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.378773 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.378707 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.378773 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.378764 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.379013 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.378821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.379077 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.379057 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.379283 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.379238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.379346 
ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.379288 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.380948 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.380913 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.381237 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.381218 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.386406 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.386380 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv8c5\" (UniqueName: \"kubernetes.io/projected/d8fb9cad-bd99-42ee-a83b-3a346f59bdb8-kube-api-access-bv8c5\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c\" (UID: \"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.457174 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.457138 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:20.578905 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:20.578871 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c"] Apr 22 21:17:20.581896 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:17:20.581868 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8fb9cad_bd99_42ee_a83b_3a346f59bdb8.slice/crio-1a655dac266571e5e959bb8f20ca14c0f1f13a655829129f9bb5bd31a16bfb4b WatchSource:0}: Error finding container 1a655dac266571e5e959bb8f20ca14c0f1f13a655829129f9bb5bd31a16bfb4b: Status 404 returned error can't find the container with id 1a655dac266571e5e959bb8f20ca14c0f1f13a655829129f9bb5bd31a16bfb4b Apr 22 21:17:21.167488 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:21.167450 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" event={"ID":"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8","Type":"ContainerStarted","Data":"fe9bb940f455477bdaf5a16f99a3b198c73c7a36ef3c2f5a057f8e2829ddd05d"} Apr 22 21:17:21.167867 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:21.167495 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" event={"ID":"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8","Type":"ContainerStarted","Data":"1a655dac266571e5e959bb8f20ca14c0f1f13a655829129f9bb5bd31a16bfb4b"} Apr 22 21:17:21.945935 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:21.945890 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:17:24.179739 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:24.179693 2570 generic.go:358] "Generic (PLEG): container finished" podID="00ac352a-d689-494c-95de-0261a2575027" containerID="31a46724001a198259b962e6d8264c4942ebd8cf49f19884edd4f2804c00ab7d" exitCode=0 Apr 22 21:17:24.180093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:24.179765 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" event={"ID":"00ac352a-d689-494c-95de-0261a2575027","Type":"ContainerDied","Data":"31a46724001a198259b962e6d8264c4942ebd8cf49f19884edd4f2804c00ab7d"} Apr 22 21:17:25.185331 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:25.185300 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" event={"ID":"00ac352a-d689-494c-95de-0261a2575027","Type":"ContainerStarted","Data":"c6bd4ee1d4b00cfb7a576972e7abe1f939aa2177a6b557a597fc14d2c7bbdaf5"} Apr 22 21:17:25.185799 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:25.185537 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:25.203170 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:25.203118 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" podStartSLOduration=6.920856269 podStartE2EDuration="7.203104019s" podCreationTimestamp="2026-04-22 21:17:18 +0000 UTC" firstStartedPulling="2026-04-22 21:17:24.180397237 +0000 UTC m=+522.289208127" lastFinishedPulling="2026-04-22 21:17:24.462644985 +0000 UTC m=+522.571455877" observedRunningTime="2026-04-22 21:17:25.201165795 +0000 UTC m=+523.309976732" 
watchObservedRunningTime="2026-04-22 21:17:25.203104019 +0000 UTC m=+523.311914931" Apr 22 21:17:27.164508 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:27.164470 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr" Apr 22 21:17:29.201419 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:29.201385 2570 generic.go:358] "Generic (PLEG): container finished" podID="d8fb9cad-bd99-42ee-a83b-3a346f59bdb8" containerID="fe9bb940f455477bdaf5a16f99a3b198c73c7a36ef3c2f5a057f8e2829ddd05d" exitCode=0 Apr 22 21:17:29.201904 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:29.201460 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" event={"ID":"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8","Type":"ContainerDied","Data":"fe9bb940f455477bdaf5a16f99a3b198c73c7a36ef3c2f5a057f8e2829ddd05d"} Apr 22 21:17:30.206638 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:30.206608 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" event={"ID":"d8fb9cad-bd99-42ee-a83b-3a346f59bdb8","Type":"ContainerStarted","Data":"4a9baa8a8dbf0478dd85db592d88a1877ceb7c214c8b29e535ca9752477ee3d4"} Apr 22 21:17:30.207011 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:30.206801 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:30.222135 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:30.222081 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" podStartSLOduration=9.985331364 podStartE2EDuration="10.22206357s" podCreationTimestamp="2026-04-22 21:17:20 +0000 UTC" firstStartedPulling="2026-04-22 21:17:29.202219733 +0000 UTC m=+527.311030624" lastFinishedPulling="2026-04-22 21:17:29.438951938 +0000 UTC m=+527.547762830" observedRunningTime="2026-04-22 21:17:30.221676976 +0000 UTC m=+528.330487912" watchObservedRunningTime="2026-04-22 21:17:30.22206357 +0000 UTC m=+528.330874485" Apr 22 21:17:36.201874 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:36.201797 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52" Apr 22 21:17:41.223543 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:41.223506 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c" Apr 22 21:17:48.851361 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:17:48.851325 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:18:42.434005 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:18:42.433978 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:18:42.435496 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:18:42.435474 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:18:45.750874 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:18:45.750841 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:18:56.545351 
ip-10-0-130-19 kubenswrapper[2570]: I0422 21:18:56.545315 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:19:05.637363 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:05.637280 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:19:15.538765 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:15.538732 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:19:25.342131 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:25.342096 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:19:35.640868 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:35.640832 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:19:42.173278 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.173228 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-db6f5bcdb-lm526"] Apr 22 21:19:42.173672 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.173480 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-db6f5bcdb-lm526" podUID="79e26af0-a55d-4e55-9bc8-26b7efbfc8ae" containerName="manager" containerID="cri-o://469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1" gracePeriod=10 Apr 22 21:19:42.414213 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.414190 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-db6f5bcdb-lm526" Apr 22 21:19:42.500476 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.500449 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84pbt\" (UniqueName: \"kubernetes.io/projected/79e26af0-a55d-4e55-9bc8-26b7efbfc8ae-kube-api-access-84pbt\") pod \"79e26af0-a55d-4e55-9bc8-26b7efbfc8ae\" (UID: \"79e26af0-a55d-4e55-9bc8-26b7efbfc8ae\") " Apr 22 21:19:42.502344 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.502317 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e26af0-a55d-4e55-9bc8-26b7efbfc8ae-kube-api-access-84pbt" (OuterVolumeSpecName: "kube-api-access-84pbt") pod "79e26af0-a55d-4e55-9bc8-26b7efbfc8ae" (UID: "79e26af0-a55d-4e55-9bc8-26b7efbfc8ae"). InnerVolumeSpecName "kube-api-access-84pbt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:19:42.601970 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.601936 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-84pbt\" (UniqueName: \"kubernetes.io/projected/79e26af0-a55d-4e55-9bc8-26b7efbfc8ae-kube-api-access-84pbt\") on node \"ip-10-0-130-19.ec2.internal\" DevicePath \"\"" Apr 22 21:19:42.641459 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.641426 2570 generic.go:358] "Generic (PLEG): container finished" podID="79e26af0-a55d-4e55-9bc8-26b7efbfc8ae" containerID="469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1" exitCode=0 Apr 22 21:19:42.641615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.641480 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db6f5bcdb-lm526" event={"ID":"79e26af0-a55d-4e55-9bc8-26b7efbfc8ae","Type":"ContainerDied","Data":"469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1"} Apr 22 21:19:42.641615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.641504 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db6f5bcdb-lm526" event={"ID":"79e26af0-a55d-4e55-9bc8-26b7efbfc8ae","Type":"ContainerDied","Data":"6925afb47d65a68a8d74af72ed27a202f6e7a63a0dbd9e9ee3a28fe98e699b38"} Apr 22 21:19:42.641615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.641509 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-db6f5bcdb-lm526" Apr 22 21:19:42.641615 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.641518 2570 scope.go:117] "RemoveContainer" containerID="469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1" Apr 22 21:19:42.649531 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.649401 2570 scope.go:117] "RemoveContainer" containerID="469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1" Apr 22 21:19:42.649727 ip-10-0-130-19 kubenswrapper[2570]: E0422 21:19:42.649700 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1\": container with ID starting with 469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1 not found: ID does not exist" containerID="469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1" Apr 22 21:19:42.649801 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.649737 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1"} err="failed to get container status \"469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1\": rpc error: code = NotFound desc = could not find container \"469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1\": container with ID starting with 469670b3b0da0597ba8992fb8d5a7fcddadf66c8198315ec4dbbe9c175f7c4c1 not found: ID does not exist" Apr 22 21:19:42.660981 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.660955 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-db6f5bcdb-lm526"] Apr 22 21:19:42.664879 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:42.664858 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-db6f5bcdb-lm526"] Apr 22 21:19:43.372866 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.372832 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["opendatahub/maas-controller-db6f5bcdb-kxf9w"] Apr 22 21:19:43.373239 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.373138 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79e26af0-a55d-4e55-9bc8-26b7efbfc8ae" containerName="manager" Apr 22 21:19:43.373239 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.373148 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e26af0-a55d-4e55-9bc8-26b7efbfc8ae" containerName="manager" Apr 22 21:19:43.373239 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.373202 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="79e26af0-a55d-4e55-9bc8-26b7efbfc8ae" containerName="manager" Apr 22 21:19:43.377309 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.377288 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" Apr 22 21:19:43.379409 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.379388 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-xw8js\"" Apr 22 21:19:43.383178 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.383153 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-db6f5bcdb-kxf9w"] Apr 22 21:19:43.509045 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.509013 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24rwd\" (UniqueName: \"kubernetes.io/projected/5b9ae69f-e4cb-419e-ab51-f2edda4e268f-kube-api-access-24rwd\") pod \"maas-controller-db6f5bcdb-kxf9w\" (UID: \"5b9ae69f-e4cb-419e-ab51-f2edda4e268f\") " pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" Apr 22 21:19:43.610139 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.610105 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24rwd\" (UniqueName: \"kubernetes.io/projected/5b9ae69f-e4cb-419e-ab51-f2edda4e268f-kube-api-access-24rwd\") pod \"maas-controller-db6f5bcdb-kxf9w\" (UID: \"5b9ae69f-e4cb-419e-ab51-f2edda4e268f\") " pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" Apr 22 21:19:43.617435 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.617411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24rwd\" (UniqueName: \"kubernetes.io/projected/5b9ae69f-e4cb-419e-ab51-f2edda4e268f-kube-api-access-24rwd\") pod \"maas-controller-db6f5bcdb-kxf9w\" (UID: \"5b9ae69f-e4cb-419e-ab51-f2edda4e268f\") " pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" Apr 22 21:19:43.688164 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:43.688137 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" Apr 22 21:19:44.012664 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:44.012641 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-db6f5bcdb-kxf9w"] Apr 22 21:19:44.015447 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:19:44.015416 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b9ae69f_e4cb_419e_ab51_f2edda4e268f.slice/crio-92bdd3e75bbe3f2bb69eba32a73ec3912cc56a1d6b214e587952e8a242039749 WatchSource:0}: Error finding container 92bdd3e75bbe3f2bb69eba32a73ec3912cc56a1d6b214e587952e8a242039749: Status 404 returned error can't find the container with id 92bdd3e75bbe3f2bb69eba32a73ec3912cc56a1d6b214e587952e8a242039749 Apr 22 21:19:44.016552 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:44.016529 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:19:44.489867 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:44.489836 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e26af0-a55d-4e55-9bc8-26b7efbfc8ae" path="/var/lib/kubelet/pods/79e26af0-a55d-4e55-9bc8-26b7efbfc8ae/volumes" Apr 22 21:19:44.650993 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:44.650900 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" event={"ID":"5b9ae69f-e4cb-419e-ab51-f2edda4e268f","Type":"ContainerStarted","Data":"3735b29c63be0347566944a20dd5209d4ce2b6fc1f350dd6507c934fd1947223"} Apr 22 21:19:44.650993 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:44.650947 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" event={"ID":"5b9ae69f-e4cb-419e-ab51-f2edda4e268f","Type":"ContainerStarted","Data":"92bdd3e75bbe3f2bb69eba32a73ec3912cc56a1d6b214e587952e8a242039749"} Apr 22 21:19:44.650993 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:44.650977 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" Apr 22 21:19:44.666393 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:44.666347 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" podStartSLOduration=1.287932283 podStartE2EDuration="1.666336041s" podCreationTimestamp="2026-04-22 21:19:43 +0000 UTC" firstStartedPulling="2026-04-22 21:19:44.016660855 +0000 UTC m=+662.125471746" lastFinishedPulling="2026-04-22 21:19:44.3950646 +0000 UTC m=+662.503875504" observedRunningTime="2026-04-22 21:19:44.665007978 +0000 UTC m=+662.773818892" watchObservedRunningTime="2026-04-22 21:19:44.666336041 +0000 UTC m=+662.775146953" Apr 22 21:19:55.659323 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:19:55.659293 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-db6f5bcdb-kxf9w" Apr 22 21:20:37.241848 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:20:37.241769 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:20:52.237009 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:20:52.236975 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:21:30.136393 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:21:30.136352 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:21:47.242786 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:21:47.242747 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:22:01.641150 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:22:01.641114 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:22:17.738306 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:22:17.738267 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:22:47.244792 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:22:47.244758 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:22:51.834191 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:22:51.834156 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:23:12.933755 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:23:12.933720 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:23:22.845553 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:23:22.845515 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:23:39.743193 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:23:39.743161 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:23:42.456954 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:23:42.456925 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:23:42.459669 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:23:42.459649 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:23:48.382878 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:23:48.382842 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:24:05.038032 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:24:05.037998 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:24:13.240992 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:24:13.240955 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:24:45.936430 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:24:45.936394 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:24:53.837952 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:24:53.837913 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:25:02.543783 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:25:02.543707 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:25:11.044293 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:25:11.044238 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:25:19.434751 ip-10-0-130-19 
kubenswrapper[2570]: I0422 21:25:19.434720 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:25:36.441360 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:25:36.441324 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:25:49.451216 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:25:49.451174 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:26:36.048840 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:26:36.048759 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:26:44.436869 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:26:44.436835 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:26:53.738231 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:26:53.738193 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:27:01.341577 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:27:01.341537 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:27:10.843001 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:27:10.842968 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:27:19.247448 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:27:19.247409 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:27:28.640163 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:27:28.640125 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:27:36.139359 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:27:36.139323 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:27:45.445096 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:27:45.445066 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:27:53.740139 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:27:53.740106 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:28:03.143670 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:28:03.143583 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:28:12.039530 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:28:12.039497 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:28:20.942138 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:28:20.942102 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:28:29.143361 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:28:29.143327 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:28:38.337703 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:28:38.337663 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:28:42.478326 ip-10-0-130-19 kubenswrapper[2570]: I0422 
21:28:42.478298 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:28:42.481616 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:28:42.481596 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:28:46.349933 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:28:46.349899 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:28:55.444570 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:28:55.444536 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:29:03.837425 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:29:03.837389 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:31:33.537548 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:31:33.537515 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:31:38.842420 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:31:38.842388 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:32:04.342187 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:32:04.342150 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:32:11.043236 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:32:11.043202 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:32:21.034682 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:32:21.034646 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:32:31.737998 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:32:31.737920 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:32:40.243510 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:32:40.243476 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:32:51.139333 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:32:51.139299 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:32:58.943259 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:32:58.943220 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:33:10.142451 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:33:10.142418 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:33:18.139596 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:33:18.139559 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:33:29.333024 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:33:29.332986 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:33:37.640268 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:33:37.640226 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:33:42.500293 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:33:42.500266 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:33:42.504206 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:33:42.504189 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:33:43.336667 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:33:43.336634 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:34:11.540784 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:34:11.540708 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:34:54.243474 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:34:54.243437 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:35:02.247099 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:35:02.247060 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:35:11.242462 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:35:11.242429 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:35:19.840096 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:35:19.840058 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:35:28.248894 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:35:28.248860 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:35:41.540885 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:35:41.540806 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:35:49.546738 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:35:49.546702 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:35:57.535806 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:35:57.535770 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:36:06.739498 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:36:06.739458 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:36:13.940461 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:36:13.940427 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:36:22.636708 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:36:22.636676 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:36:32.934446 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:36:32.934416 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:36:50.840527 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:36:50.840491 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:36:59.146834 ip-10-0-130-19 
kubenswrapper[2570]: I0422 21:36:59.146801 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:37:07.947511 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:37:07.947431 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:37:16.134933 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:37:16.134897 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:37:33.041156 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:37:33.041122 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:37:41.340446 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:37:41.340409 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:37:50.040629 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:37:50.040586 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:37:58.242735 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:37:58.242703 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:38:08.538471 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:38:08.538439 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:38:16.941890 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:38:16.941856 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:38:26.042006 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:38:26.041970 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:38:35.934067 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:38:35.933986 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:38:42.520329 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:38:42.520296 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:38:42.531709 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:38:42.531686 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:38:44.844710 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:38:44.844676 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:38:56.638093 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:38:56.638056 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:39:05.545893 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:39:05.545857 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:39:13.650367 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:39:13.650330 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:39:22.134586 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:39:22.134550 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:39:31.437774 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:39:31.437734 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:39:49.144219 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:39:49.144185 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:39:57.443868 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:39:57.443837 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:40:06.548772 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:06.548692 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:40:14.343522 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:14.343479 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:40:37.836040 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:37.836005 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:40:50.167561 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:50.167518 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kr4vc"] Apr 22 21:40:56.598815 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:56.598778 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-db6f5bcdb-kxf9w_5b9ae69f-e4cb-419e-ab51-f2edda4e268f/manager/0.log" Apr 22 21:40:57.055009 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:57.054972 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-754bfc4657-nnr4t_82c088f5-8d4c-4aed-9424-eebf30592f8f/manager/0.log" Apr 22 21:40:58.730632 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:58.730602 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-d9hft_29d9d6f1-be0b-4aa1-9588-49353ca77887/kuadrant-console-plugin/0.log" Apr 22 21:40:58.842037 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:58.842007 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-ngbgq_1eb452c2-c921-4091-ab91-de530abb6130/registry-server/0.log" Apr 22 21:40:59.084638 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:59.084562 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-kr4vc_b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4/limitador/0.log" Apr 22 21:40:59.536144 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:59.536119 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fkv5fv_04aebafb-0223-4fea-b000-baf860d9b3b7/istio-proxy/0.log" Apr 22 21:40:59.982295 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:40:59.982264 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-7slzq_684e367c-ce68-4452-b7f4-9d7004a05e85/istio-proxy/0.log" Apr 22 21:41:00.425617 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:00.425541 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52_00ac352a-d689-494c-95de-0261a2575027/storage-initializer/0.log" Apr 22 
21:41:00.437791 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:00.437769 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-s9c52_00ac352a-d689-494c-95de-0261a2575027/main/0.log" Apr 22 21:41:00.658875 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:00.658831 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-xt9k7_e4708691-3472-4f12-96a3-fceba7d30f92/storage-initializer/0.log" Apr 22 21:41:00.664792 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:00.664774 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-xt9k7_e4708691-3472-4f12-96a3-fceba7d30f92/main/0.log" Apr 22 21:41:00.775501 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:00.775474 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr_e0cd7b77-8550-4fa3-be6b-8b01de03626c/storage-initializer/0.log" Apr 22 21:41:00.783740 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:00.783717 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc7rftr_e0cd7b77-8550-4fa3-be6b-8b01de03626c/main/0.log" Apr 22 21:41:01.016071 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:01.016046 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c_d8fb9cad-bd99-42ee-a83b-3a346f59bdb8/storage-initializer/0.log" Apr 22 21:41:01.023565 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:01.023536 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-2gn8c_d8fb9cad-bd99-42ee-a83b-3a346f59bdb8/main/0.log" Apr 22 21:41:07.989727 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:07.989691 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-jbwgj_467caf5c-14f4-4489-a131-5028add687dc/global-pull-secret-syncer/0.log" Apr 22 21:41:08.145354 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:08.145320 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fxk2q_79e5edd2-847e-4f99-9bd9-f7ba3a94cd4e/konnectivity-agent/0.log" Apr 22 21:41:08.209209 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:08.209172 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-19.ec2.internal_a5e99fc6db543cf6951686e44ee274cc/haproxy/0.log" Apr 22 21:41:12.541926 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:12.541884 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-d9hft_29d9d6f1-be0b-4aa1-9588-49353ca77887/kuadrant-console-plugin/0.log" Apr 22 21:41:12.578095 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:12.578052 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-ngbgq_1eb452c2-c921-4091-ab91-de530abb6130/registry-server/0.log" Apr 22 21:41:12.698430 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:12.698391 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-kr4vc_b4cfd43b-fb04-4855-b4b9-0dbbd49c42b4/limitador/0.log" Apr 22 21:41:14.620524 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:14.620496 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-46ljz_d197d611-b70d-4786-b1c4-59fd6632ebb9/node-exporter/0.log" Apr 22 21:41:14.640010 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:14.639983 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-46ljz_d197d611-b70d-4786-b1c4-59fd6632ebb9/kube-rbac-proxy/0.log" Apr 22 21:41:14.660629 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:14.660608 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-46ljz_d197d611-b70d-4786-b1c4-59fd6632ebb9/init-textfile/0.log" Apr 22 21:41:15.115867 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:15.115834 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-fsbct_c27159ec-9cf6-4f65-a975-c4509499046f/prometheus-operator-admission-webhook/0.log" Apr 22 21:41:16.519906 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.519869 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6"] Apr 22 21:41:16.523137 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.523117 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.525501 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.525476 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nnb7p\"/\"kube-root-ca.crt\"" Apr 22 21:41:16.525622 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.525521 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nnb7p\"/\"default-dockercfg-2cdrj\"" Apr 22 21:41:16.526390 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.526377 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nnb7p\"/\"openshift-service-ca.crt\"" Apr 22 21:41:16.532701 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.532678 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6"] Apr 22 21:41:16.699533 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.699500 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-sys\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.699739 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.699555 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-podres\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.699739 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.699644 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rk5v\" (UniqueName: \"kubernetes.io/projected/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-kube-api-access-9rk5v\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 
21:41:16.699739 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.699699 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-proc\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.699739 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.699733 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-lib-modules\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.801044 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.800961 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-sys\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.801044 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.801011 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-podres\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.801044 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.801033 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rk5v\" (UniqueName: \"kubernetes.io/projected/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-kube-api-access-9rk5v\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.801288 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.801054 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-proc\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.801288 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.801070 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-lib-modules\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.801288 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.801094 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-sys\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.801288 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.801192 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" 
(UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-proc\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.801288 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.801203 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-lib-modules\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.801288 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.801205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-podres\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.809237 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.809209 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rk5v\" (UniqueName: \"kubernetes.io/projected/48cc09dd-1d01-414c-8d2d-09f8a0be3c61-kube-api-access-9rk5v\") pod \"perf-node-gather-daemonset-78gv6\" (UID: \"48cc09dd-1d01-414c-8d2d-09f8a0be3c61\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.833166 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.833146 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:16.956460 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.956434 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6"] Apr 22 21:41:16.959117 ip-10-0-130-19 kubenswrapper[2570]: W0422 21:41:16.959087 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod48cc09dd_1d01_414c_8d2d_09f8a0be3c61.slice/crio-9c87ef777b88b335e39413e83a83e98f1b6a8207e868ced268c060085a89b3f9 WatchSource:0}: Error finding container 9c87ef777b88b335e39413e83a83e98f1b6a8207e868ced268c060085a89b3f9: Status 404 returned error can't find the container with id 9c87ef777b88b335e39413e83a83e98f1b6a8207e868ced268c060085a89b3f9 Apr 22 21:41:16.960935 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:16.960916 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:41:17.346374 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:17.346295 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-kz6zz_7ed62681-dcb6-451c-970e-c5b940202a6b/download-server/0.log" Apr 22 21:41:17.821831 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:17.821800 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" event={"ID":"48cc09dd-1d01-414c-8d2d-09f8a0be3c61","Type":"ContainerStarted","Data":"659ee7abba598d285e603c49c1bd30cc3354c063fd7a4fb45d76c96786aee253"} Apr 22 21:41:17.821831 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:17.821834 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" 
event={"ID":"48cc09dd-1d01-414c-8d2d-09f8a0be3c61","Type":"ContainerStarted","Data":"9c87ef777b88b335e39413e83a83e98f1b6a8207e868ced268c060085a89b3f9"} Apr 22 21:41:17.822286 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:17.821860 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:17.835946 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:17.835900 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" podStartSLOduration=1.835882873 podStartE2EDuration="1.835882873s" podCreationTimestamp="2026-04-22 21:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:41:17.835554192 +0000 UTC m=+1955.944365110" watchObservedRunningTime="2026-04-22 21:41:17.835882873 +0000 UTC m=+1955.944693789" Apr 22 21:41:18.645790 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:18.645764 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5nj7r_570e8677-7a14-41e1-af96-2344f7ef5d3a/dns/0.log" Apr 22 21:41:18.663949 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:18.663924 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5nj7r_570e8677-7a14-41e1-af96-2344f7ef5d3a/kube-rbac-proxy/0.log" Apr 22 21:41:18.728225 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:18.728196 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5tcwn_ac2a064a-e64c-4f46-aa0e-e1056872e044/dns-node-resolver/0.log" Apr 22 21:41:19.210507 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:19.210482 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-55c565d499-2vp9n_dd1972a0-df78-46e1-adb6-9b8e74b0f9f1/registry/0.log" Apr 22 21:41:19.229229 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:19.229209 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4wjxv_400a6d5d-3d9c-4307-9701-895aad7b37b7/node-ca/0.log" Apr 22 21:41:20.004797 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:20.004768 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fkv5fv_04aebafb-0223-4fea-b000-baf860d9b3b7/istio-proxy/0.log" Apr 22 21:41:20.269887 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:20.269812 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-7slzq_684e367c-ce68-4452-b7f4-9d7004a05e85/istio-proxy/0.log" Apr 22 21:41:20.743555 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:20.743528 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2zb4s_070e50f3-495a-4586-b0bd-a251eb98bccc/serve-healthcheck-canary/0.log" Apr 22 21:41:21.294642 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:21.294619 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m4m5v_97d77e34-3326-4a42-96bb-659316f50103/kube-rbac-proxy/0.log" Apr 22 21:41:21.318764 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:21.318737 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m4m5v_97d77e34-3326-4a42-96bb-659316f50103/exporter/0.log" Apr 22 21:41:21.337924 ip-10-0-130-19 
kubenswrapper[2570]: I0422 21:41:21.337906 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m4m5v_97d77e34-3326-4a42-96bb-659316f50103/extractor/0.log" Apr 22 21:41:23.332686 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:23.332657 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-db6f5bcdb-kxf9w_5b9ae69f-e4cb-419e-ab51-f2edda4e268f/manager/0.log" Apr 22 21:41:23.464241 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:23.464213 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-754bfc4657-nnr4t_82c088f5-8d4c-4aed-9424-eebf30592f8f/manager/0.log" Apr 22 21:41:23.835300 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:23.835271 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-78gv6" Apr 22 21:41:24.567691 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:24.567657 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7979f84667-j9fqk_2a247d13-757d-4f7b-9e3e-d9ba0fec288c/manager/0.log" Apr 22 21:41:29.000046 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:29.000019 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-6mgmm_32a4c04c-bb6b-4ecb-8790-7acaad9a70f1/migrator/0.log" Apr 22 21:41:29.043985 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:29.043959 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-6mgmm_32a4c04c-bb6b-4ecb-8790-7acaad9a70f1/graceful-termination/0.log" Apr 22 21:41:30.683386 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:30.683360 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb9x8_eb7a4eac-7e6d-40ce-abb1-594e34fb2571/kube-multus-additional-cni-plugins/0.log" Apr 22 21:41:30.702335 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:30.702313 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb9x8_eb7a4eac-7e6d-40ce-abb1-594e34fb2571/egress-router-binary-copy/0.log" Apr 22 21:41:30.720321 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:30.720301 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb9x8_eb7a4eac-7e6d-40ce-abb1-594e34fb2571/cni-plugins/0.log" Apr 22 21:41:30.740793 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:30.740776 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb9x8_eb7a4eac-7e6d-40ce-abb1-594e34fb2571/bond-cni-plugin/0.log" Apr 22 21:41:30.760074 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:30.760049 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb9x8_eb7a4eac-7e6d-40ce-abb1-594e34fb2571/routeoverride-cni/0.log" Apr 22 21:41:30.779312 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:30.779292 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb9x8_eb7a4eac-7e6d-40ce-abb1-594e34fb2571/whereabouts-cni-bincopy/0.log" Apr 22 21:41:30.798074 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:30.798054 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bb9x8_eb7a4eac-7e6d-40ce-abb1-594e34fb2571/whereabouts-cni/0.log" Apr 22 21:41:30.967847 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:30.967770 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-l48h6_32a5e549-f5ac-4611-99cf-e4b2fcd750db/kube-multus/0.log" Apr 22 21:41:31.130095 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:31.130071 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rpz8w_f4126f8f-7b88-4c50-82f3-3a91a3388519/network-metrics-daemon/0.log" Apr 22 21:41:31.148855 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:31.148830 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rpz8w_f4126f8f-7b88-4c50-82f3-3a91a3388519/kube-rbac-proxy/0.log" Apr 22 21:41:31.916526 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:31.916499 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-controller/0.log" Apr 22 21:41:31.934976 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:31.934954 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/0.log" Apr 22 21:41:31.944828 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:31.944804 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovn-acl-logging/1.log" Apr 22 21:41:31.961711 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:31.961690 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/kube-rbac-proxy-node/0.log" Apr 22 21:41:31.982923 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:31.982900 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 21:41:32.002009 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:32.001986 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/northd/0.log" Apr 22 21:41:32.020243 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:32.020219 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/nbdb/0.log" Apr 22 21:41:32.038539 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:32.038520 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/sbdb/0.log" Apr 22 21:41:32.129984 ip-10-0-130-19 kubenswrapper[2570]: I0422 21:41:32.129918 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42mgf_d3a676b5-93c4-4a35-9feb-bcfdb41df40e/ovnkube-controller/0.log"